Recently I was invited to speak at a meeting, What Next After Coronavirus: The Good, The Bad and The Ugly, hosted by Kensal & Kilburn Better 2020, with Melissa Benn, Ed Miliband, Christine Berry, Hilary Cottam and Anita Whittaker. You can see the video here: https://youtu.be/Le3LNa724JI


I think Arundhati Roy was correct in her framing of this moment. She said we should treat the pandemic as a portal to a new world and use this moment to think about what we would like to take with us and what we would like to leave behind.

As the chair of a digital rights organisation and a writer on these topics, I believe the biggest political threat of this virus is that the government will continue to inculcate a culture of surveillance. It was clear early on that technology would be deployed in response to the virus, and it does have a role to play. But a really ugly aspect of this pandemic has been the attempt to solidify the idea that we have a social duty to tolerate invasions of privacy and other human rights for the sake of public health. This takes the form of requiring us to download a contact tracing app, or imposing facial recognition technology upon us in public spaces for things like body temperature checks.

It’s important to acknowledge that this kind of culture already exists. Policy making on tech issues routinely uses the cover of risks to health and safety to justify a rebalancing of power away from people and towards government. This is often facilitated by bad-faith media coverage, industry cheerleaders and weak parliamentary opposition. We need to find ways to change this mode of operating, which has served many politicians so well. I think the answer is to build movements that force policy makers to feel responsible – and to feel their jobs are at risk – when they make decisions in this space that are not in the public interest.

What I think is positive about this moment is that people are actually starting to reject this mode of operating. They are refusing to accept this kind of political bargain where we must all give up privacy and autonomy for public safety. The experience of Covid-19 demonstrates how privacy and public health go together – that governments need to earn a social licence to use technology in particular ways.

The example that comes to mind is the Australian CovidSafe app (which is very similar to the UK proposal currently being developed). The app was widely sold as the only way out of lockdown; it was compared to sunscreen and even to a vaccine. Rumour has it that early proposals included much more ambitious attempts to turn it into a system of digital identity, linking a person’s driver’s licence, welfare status, health care details and so on to the app. This was defeated, which is a good thing. And with dismal take-up rates, the app is now almost never spoken about; the data it generates has been used on only a handful of occasions.

The UK situation is very similar – or perhaps an echo of its future. The Australian app was deployed in May; the UK version is yet to launch. Yet the UK app has similarly been talked about as though it has magical qualities to stop the spread of the virus, even as people increasingly realise that other factors – health infrastructure and social distancing among them – will matter far more to contact tracing efforts.

This comes in the context of a wave of protests against police brutality and violence, and an increase in far-right organising. These protests provide very good examples of why people might be sceptical of contact tracing apps and facial recognition technology, which could be turned against them to quash dissent.

A big reason these projects have not worked is that people do not trust the government. There were problems with the coding, no doubt, but the fundamental problem was political, not technical. It’s no longer possible for policy makers to treat human rights as an afterthought in tech initiatives, or to dismiss privacy issues as concerns held only by people who wear tin-foil hats. The deficit of public trust means these projects simply cannot succeed. That is, on the one hand, a good thing. But it is also a reflection of the failure of our social democracy, and the costs are borne by us.

So if we are to build trust, governments first need to do the right thing and stop constantly violating our privacy. Second, we need to build technology that is decentralised, open source and designed to prioritise accountability. These features need to be baked into the design process, not patched on later, once the opportunity to gain public trust has been squandered. And if we can’t offer that, maybe we need a moratorium – a conclusion that IBM, Amazon and Microsoft have recently drawn in respect of facial recognition technology. These companies are now refusing to sell it to law enforcement. That’s a good start, but perhaps it is time we took this ban further.
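To make "decentralised" concrete, here is a rough, illustrative sketch of how a decentralised contact tracing design works in principle (in the spirit of the DP-3T proposal and the Apple/Google Exposure Notification system). All names and parameters below are simplified assumptions for illustration, not a real protocol implementation.

```python
import hashlib
import hmac
import secrets

def daily_key() -> bytes:
    """Each phone generates a fresh random key per day, kept on-device."""
    return secrets.token_bytes(16)

def ephemeral_ids(key: bytes, n: int = 96) -> list:
    """Derive short-lived broadcast IDs from the daily key.
    Without the key, observers cannot link the IDs to a person."""
    return [hmac.new(key, i.to_bytes(2, "big"), hashlib.sha256).digest()[:16]
            for i in range(n)]

# Phone A broadcasts its ephemeral IDs over Bluetooth; phone B records
# the IDs it hears, locally. No central server sees the contact graph.
key_a = daily_key()
heard_by_b = set(ephemeral_ids(key_a)[:3])  # B overheard a few of A's IDs

# If A later tests positive, A publishes only its daily keys. Every
# phone re-derives the ephemeral IDs locally and checks for a match.
published_keys = [key_a]
exposed = any(eid in heard_by_b
              for k in published_keys
              for eid in ephemeral_ids(k))
print(exposed)  # prints True: the exposure check ran on B's own device
```

The point of the design is that matching happens on each person's device against locally stored observations: the government never holds a database of who met whom, which is exactly the accountability-by-design property argued for above.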

In short, technology will have a role to play in stopping the spread of this virus. But this moment has presented a clear example of why prioritising privacy and other human rights in the online world is not counterposed to the success of these projects; in fact, cultivating a culture of privacy and respect for human rights is the only way we can achieve it. This is the moment to agitate and organise around these issues, because there are so many clear examples of why these ideas are not just true in theory, they are also true in practice.

So if the pandemic is a portal, we should leave behind the culture whereby we are encouraged to treat privacy as something we must give up for public health and safety. We should take with us the idea that we should always question governments and companies when they deploy technology for these purposes, because that is not a sign of paranoia or naysaying – it’s the sign of a healthy democracy.