Surveillance capitalism in the age of Covid-19

Could the Covid-19 coronavirus pandemic further consolidate surveillance capitalist practices and enterprises? Author Shoshana Zuboff warns Computer Weekly it is possible

When the dot com bubble burst in April 2000, the business environment of Silicon Valley went into a tailspin. Startups with massive valuations were suddenly shuttered, while shell-shocked venture capitalists began to panic about whether they would see returns on their investments in surviving businesses.

One such enterprise was Google, which had been incorporated just two years earlier and whose revenues at the time depended primarily on licensing deals for web services.

Initially, founders Larry Page and Sergey Brin vehemently opposed the idea of advertising-funded search engines, which they condemned as “inherently biased towards the advertisers and away from the needs of consumers”.

Shoshana Zuboff, author of The age of surveillance capitalism: The fight for a human future at the new frontier of power and a professor emerita at Harvard Business School, argues it was only when the bubble burst and pressure from investors began to mount that Google discovered its servers were full of behaviourally rich data.

“The data was considered just waste material. It was only in the heat of emergency they discovered these digital traces, which they had thought were worthless, were actually full of rich predictive signals,” Zuboff tells Computer Weekly.

Previously, data had been used only to improve the quality of search results, but anything beyond what was needed for those improvements constituted a “surplus” that signalled to Google how people were behaving.

This behavioural data surplus could then be used to target advertising at individual users, allowing Google to make increasingly accurate predictions about how its users would behave.

This, in turn, improved the profitability of adverts for both Google and its business customers.
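To make the idea of a behavioural surplus concrete, here is a minimal sketch of how leftover behavioural signals can be turned into a prediction. It is purely illustrative: the feature names, the synthetic data and the simple scikit-learn classifier are assumptions invented for this example, not a depiction of any real company’s ad-targeting systems.

```python
# Illustrative only: invented behavioural features and synthetic data,
# not a depiction of any real ad-targeting pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one hypothetical search session. The first column (query length)
# is all that is needed to serve results; the remaining columns are "surplus":
# dwell time on results (seconds), queries per session, late-night flag.
sessions = np.array([
    [12,   5.0, 1, 0],
    [ 8,  45.0, 4, 0],
    [15,  90.0, 7, 1],
    [ 6,   3.0, 1, 0],
    [20, 120.0, 9, 1],
    [10,  60.0, 5, 0],
])
clicked_ad = np.array([0, 1, 1, 0, 1, 0])  # synthetic labels: did the user click an ad?

# A simple classifier trained on the surplus turns those traces into a
# prediction product: the probability that a given user will click.
model = LogisticRegression(max_iter=1000).fit(sessions, clicked_ad)

new_session = np.array([[11, 75.0, 6, 1]])
print(model.predict_proba(new_session)[0, 1])  # predicted click probability
```

The point is simply that data incidental to delivering the service becomes the raw material for predicting future behaviour, and it is that prediction, rather than the search result itself, which is sold on.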

“It was on the back of those data streams – data that we did not even know that we were producing, and we did not know that they were taking – that they could create a trillion-dollar empire,” says Zuboff.

“You take something from someone without their knowledge, claim you own it, and then use it to become unprecedentedly wealthy; lay that out for a child and they would say, ‘Oh, you mean they stole it’,” she says. “This is the original sin that made surveillance capitalism possible. The economics depend upon keeping data flowing. That’s why they don’t care if an ad, article or video is true or false, as long as it engages users and keeps their data supply chains full.”

Zuboff adds that we are now in a situation where surveillance capitalist enterprises, including Google, Amazon, Facebook, Microsoft and others, are sitting on “configurations of knowledge about individuals, groups and society that are unprecedented in human history”.

“The problem is that all this knowledge is about us, but it is not for us. It’s used to fabricate predictions of our behaviour that are sold to business customers. This began with online targeted ad markets, but now has migrated throughout the ‘normal’ economy,” she says.

“Instead of democratisation of knowledge, we have extreme asymmetry of knowledge. We thought we would now have access to proprietary knowledge, but just the opposite has happened – proprietary knowledge, owned and operated by these companies, now has complete access to us, without resistance or friction, because we don't even know it’s happening.”

It is this logic – the claiming of human experience as free raw material for translation into behavioural data, and in turn its commodification into “prediction products” to be sold in “behavioural futures markets” – that Zuboff says constitutes surveillance capitalism.

Far from being a passive development, Zuboff describes the practice of surveillance capitalism as “a direct assault on human autonomy”.

“That knowledge turns into power, just as in engineering monitoring capabilities lead to actuation capabilities – the more I know about you, the more I can intervene with your behaviour and shape it in ways that make it more predictable. These interventions are subtle and designed to bypass your awareness. It also means that this growing power is completely unaccountable,” she says, adding that without awareness, which is the basis of human autonomy, people cannot have meaningful agency or choice.

“If you don’t have autonomy and agency and so forth, forget about democracy. Surveillance capitalism is an economic logic that is fundamentally anti-democratic, both at the level of the individual and the larger societal framework – as democracy is weakened, surveillance capitalists aim to fill the void with their own form of ‘computational governance’ in which we are ruled by algorithms under the aegis of private capital.”

‘Business as usual’ for surveillance capitalists

Now, as the world grapples with a global pandemic and people are spending more time than ever living and working in the digital milieu, surveillance capitalists are poised to further expand their supply chains in what Zuboff calls an “extraordinary boondoggle”.

“While it is a crisis for all of us, it is something like business as usual for surveillance capitalists, in the sense that it is an opportunity to, possibly, significantly enhance their behavioural data supply chains,” she says, adding that Google has long been looking to establish a beachhead in health data (see box: Google’s push for health data), mainly through various partnerships and acquisitions.

Google’s push for health data

Google has previously landed in hot water over the transfer of medical data when, in 2017, the transfer of 1.6 million patient records from the Royal Free Hospital in London to its artificial intelligence firm DeepMind Health was found to have an “inappropriate legal basis”.

In November 2019, it was also reported by the Wall Street Journal that the personal medical data of more than 50 million Americans was being transferred from Ascension, the US’s second largest healthcare provider, to Google, where staff could easily access the full set of personal details.

The report noted that Amazon, Apple and Microsoft were also “aggressively pushing into healthcare”, though they were yet to strike deals of the same scale.

As recently as January 2020, it came to light that Google had offered Cerner, a health data company responsible for one of the largest global collections of patient data, around $250m in discounts and incentives to store its data on Google servers.

However, Cerner backed out because it was not convinced that Google would refrain from trying to commercialise the data at some point down the line, much as Facebook ended up doing with the WhatsApp data it had pledged to keep separate.

“For a company like Google, creating contact-tracing applications in partnership with Apple or the many other ways in which it wants to lend its resources for disease tracking and containment, the great likelihood is that these become institutionalised as new supply chains for Google,” says Zuboff, adding that a company’s past activity is a good indicator of how it will behave in the future.

“We know that the surveillance capitalists’ already historic lobbying machine has been out there using this crisis to try and consolidate huge gains for these companies, including trying to get California to postpone enforcement of its new state-wide privacy rules until 2021,” she says.

“We’ve even seen former Google CEO Eric Schmidt quoted as saying the coronavirus pandemic will make big tech even bigger. He’s not saying maybe, he’s saying it will – they are confident about this.”

Surveillance capitalism’s second state of exception

While the economic logic represented by surveillance capitalism was born in the first few months of the new millennium, Zuboff argues it was not until after the 9/11 terrorist attacks that it really started to take root.

At this point, elected officials, who only months before had been discussing how to regulate the emerging internet sector, became very interested in allowing private companies such as Google to develop these still-nascent surveillance capabilities.

“There were some comprehensive proposals for federal privacy legislation. All of that changed with 9/11; ‘total information awareness’ became the new obsession,” says Zuboff.

“The idea was we’ll let these surveillance capabilities develop in the young internet companies because we're going to need them and, at least in the United States, you can do things in a private company outside of constitutional constraints, whereas even our intelligence agencies ultimately are held to account by our Constitution and the rule of law, as much as they may try to play with that boundary.”

Zuboff dubs the shared interests of governments and private companies in these information-intensive surveillance capabilities an “elective affinity”. The development of this affinity was aided by the specific historical conditions of the dot com bubble and 9/11, which provided a “state of exception” for the new market form of surveillance capitalism to grow and evolve.

“Both institutions craved certainty and were determined to fulfil that craving in their respective domains at any price,” Zuboff writes in the book. “These elective affinities sustained surveillance exceptionalism and contributed to the fertile habitat in which the surveillance capitalism mutation would be nurtured to prosperity.”

Now, the “boondoggle” of Covid-19 represents a second state of exception for surveillance capitalists.

In mid-March for example, the UK’s prime minister, Boris Johnson, invited more than 30 technology companies – including Google, Facebook, Apple, Amazon, DeepMind and controversial surveillance-as-a-service firm Palantir – to Downing Street in an attempt to commandeer their resources in the fight against Covid-19.

Leading scientists in the UK also urged these companies to “invest in society” by sharing their data with government and researchers.

“Digital data from billions of mobile phones and footprints from web searches and social media remain largely inaccessible to researchers and governments. These data could support community surveillance, contact tracing, social mobilisation, health promotion, communication with the public and evaluation of public health interventions,” they wrote.

On 26 March, the BBC reported that Amazon, Microsoft, Palantir and Faculty (the artificial intelligence firm hired by Vote Leave) had already struck deals with the NHS to pull together its disparate data and the data its partners hold.

On top of this, a notice from health secretary Matt Hancock, which was signed on 20 March but only made public on 1 April, provides legal backing for the NHS to set aside its duty of confidentiality in data-sharing arrangements.

Dubbed the Covid-19 Purpose, the new data-sharing agreement means NHS organisations and GPs can share any and all patient data with any organisation they like, so long as it’s for the purpose of fighting the coronavirus outbreak.

“We’re at an impasse now – we’re meeting this crisis at a time when we don’t have all the pieces in place that would allow us to trust such technology applications and their complete dedication to public health objectives, because they remain in the unregulated, lawless space of private surveillance capital,” says Zuboff.

“We’re dependent on their word to say, ‘Of course we won’t collect more than is required for public health, of course we won’t use the data to identify people, of course we’re going to delete it when we’re past the pandemic’, so we’re back to self-regulation when we already know self-regulation doesn’t work – just in the same way Facebook bought WhatsApp and said [it would] keep it a separate company.”

According to research by digital security firm Surfshark, 60% of contact-tracing apps around the globe are unclear about what they track, do not provide terms and conditions upfront, or use intrusive methods, such as surveillance camera footage, to keep tabs on users.

Only four in ten were also identified as being developed by, or with the help of, non-governmental bodies or private companies.

While surveillance capitalists have been busy expanding their supply chains, governments have been expanding their own surveillance apparatus too.

The expansion of state surveillance

For example, in the UK, the emergency Coronavirus Act relaxed restrictions on mass surveillance contained in the Investigatory Powers Act 2016 and expanded police detention powers.

Of the UK’s 43 police forces, at least 26 have also set up dedicated online portals so people can report those breaking lockdown rules.

Although police claim they will only respond to the most serious incidents, and that the primary reason for the portals is to take pressure off the 101 non-emergency number, they give these forces access to a wealth of new data that could be used for controversial “predictive policing” efforts, which mainly target poor and racialised communities.

The UK is not alone, however, as governments and public bodies around the world have rushed to adopt new surveillance systems, ranging from dragnet monitoring systems and the collection of real-time location data, to the deployment of facial recognition and thermal cameras.

In France, for example, the Ministry of the Interior put a tender out for the acquisition of some 650 surveillance drones, which would effectively double its current fleet, while in Baltimore in the US, the police department has started testing an aerial photography system capable of tracking the city’s 600,000 inhabitants.

Similar efforts to either digitally or physically survey those suspected of infection or breaking government lockdown rules are in place all over the world, including in Spain, Australia, India, Belgium, Argentina, Russia and China, among many others.

“What we don’t have now is a sufficiently complex set of laws that would allow us to meet these challenges of the digital,” says Zuboff. “In the coming decade, we need to create new democratic institutions, laws and regulatory paradigms that assert democratic governance over surveillance capitalism’s unaccountable power. This is the only way we can ensure that data and artificial intelligence serve society and democracy.”

She adds that it is much easier to expand surveillance powers than it is to curtail them once granted, pointing out that, nearly 20 years on, the US Congress has only just started to roll back elements of the Patriot Act, a sweeping piece of emergency anti-terrorism legislation that was rushed into law after 9/11 and expanded US surveillance powers.

“Anything that is proposed must come with its own self-destruct measures that are codified in law,” insists Zuboff.

Optimism over despair?

Despite the opportunity Covid-19 represents for surveillance capitalists and the risks posed by increased state surveillance, Zuboff maintains that nothing about the future is inevitable.

“It depends on us. It depends on every single person who reads what you write and the conversations they have within their communities. It depends on the new forms of collective action, social movements, it depends on demanding new laws, new institutions from our elected officials,” she says.

“That’s not going to happen overnight, but this is the decade in which we have to do it.”

Zuboff says a sea change has already occurred over the past few years in public attitudes towards the business practices of surveillance capitalists, largely thanks to revelations such as those about Cambridge Analytica and other surveillance capitalists, which means we are now having conversations that would have been impossible five to ten years ago.

“In November of last year, Pew Research did a survey in the US, where 81% of Americans said that the cost of private companies’ data collection outweighed the benefits, so there are very significant ways in which we are no longer naïve – people are raising these questions and have substantial misgivings,” she says.

“That suggests we’re at a place in our history where it’s finally the time to invent the laws, the institutional forms, that are going to allow us to approach the digital future in a way that is compatible with democracy.”

Zuboff concludes that the fight against surveillance capitalism is a problem of collective action: “We need new social movements, we need new forms of social solidarity. Lawmakers need to feel our pressure at their backs.

“In the end, we’re not talking about bad people, we’re talking about an economic logic that is solely responsible for trillion-dollar market capitalisations. It has its iron laws. Now that we have begun to understand surveillance capitalism and its real threats to our future, shame on us if we don’t look at that, scrutinise it, and understand what it means for the future if we give them free run.”
