Surveillance Capitalism versus Democracy
Nicola Vallinoto
Computer scientist and international democracy activist
Shoshana Zuboff
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Profile Books, London, 2019
Coronavirus is the new global crisis that surveillance capitalism was waiting for. After the terrorist attack on the Twin Towers in 2001, the pandemic marks another great step towards more extensive control over our lives. Consider China and South Korea, with their tracking apps that check for the presence of Covid-19 and monitor people's proximity. Other countries will adopt this method, asking their citizens to download an app onto their smartphones to face the second phase of the pandemic. We, as citizens, are voluntarily giving access to our data to protect ourselves from Covid-19, but who will defend us from abuses, and who guarantees that no one will use our data for different aims once the pandemic is over? And this is only the beginning: the former Google CEO, Eric Schmidt, imagines a post-Covid reality in which invasive technology is permanently embedded in every aspect of civic life.
The Age of Surveillance Capitalism is the latest, impressive book by the American sociologist Shoshana Zuboff, published just before the coronavirus arrived. Let's go back to the beginning of this century and to Google, the initiator of so-called "surveillance capitalism", as the author explains. In Google's early period, behavioral data were put to work entirely to the user's advantage. User data provided value at no cost, and that value was reinvested in the user experience in the form of improved services: enhancements that were also offered at no cost to users. In April 2000 the legendary dot-com economy began its steep plunge into recession, and Silicon Valley's Garden of Eden unexpectedly became the epicenter of a financial earthquake. Venture capitalists lost confidence in Google's ability to make money, and at Google, in late 2000, the crisis became a reason for annulling the reciprocal relationship between the company and its users, pushing the founders to abandon their passionate and public opposition to advertising.
After the 2001 terrorist attack, Google’s declared state of exception was the backdrop for 2002, the watershed year during which surveillance capitalism took root. In his book Surveillance After September 11, surveillance scholar David Lyon writes that in the aftermath of the attacks that day, existing surveillance practices were intensified and previous limits were lifted. In that environment of trauma and anxiety, a “state of exception” was invoked to legitimate a new imperative: speed at any cost. The suspension of normal conditions was justified with reference to the “war on terror”. Critical to our story is the fact that this state of exception favored Google’s growth and the successful elaboration of its surveillance-based logic of accumulation.
The elective affinity between public intelligence agencies and the fledgling capitalist Google blossomed in the heat of the emergency to produce a unique historical deformity: surveillance exceptionalism. It appears that one of the unanticipated consequences of this public-private “elective affinity” was that the fledgling practices of surveillance capitalism were allowed to root and grow with little regulatory or legislative control, emboldening Google’s young leaders to insist on lawlessness as a natural right and, in ways that are even more opaque, emboldening the state to grant them that freedom.
With Google's unique access to behavioral data, it would now be possible to know what a particular individual, at a particular time and place, was thinking, feeling and doing. User Profile Information (UPI) may be inferred, preserved and deduced. UPI may be provided by the user, by a third party authorized to release user information, and/or derived from the user's actions. Behavioral data became the raw material for the construction of a dynamic online advertising marketplace.
After 2001, Google started to operate in obscurity, indifferent to social norms or individuals' claims to protect their own decision rights. These moves established the foundational mechanisms of surveillance capitalism.
The summary of these developments is that the behavioral surplus upon which Google’s fortune rests can be considered as surveillance assets. These assets are critical raw materials in the pursuit of surveillance revenues, and their translation into surveillance capital. The entire logic of this capital accumulation is most accurately understood as surveillance capitalism, which is the foundational framework for a surveillance-based economic order: a surveillance economy.
The big pattern here is one of subordination and hierarchy, in which earlier reciprocities between the firm and its users are subordinated to the derivative project of our behavioral surplus captured for others’ aims. We are no longer the subjects of value realization. Nor are we, as some have insisted, the “product” of Google’s sales. Instead, we are the objects from which raw materials are extracted and expropriated for Google’s prediction factories. Predictions about our behavior are Google’s products, and they are sold to its actual customers, but not to us. We are the means to others’ ends.
Machine intelligence processes behavioral surplus into prediction products designed to forecast what we will feel, think, and do: now, soon and later. Prediction products are sold into a new kind of market that trades exclusively in future behavior. Surveillance capitalism’s profits derive primarily from these behavioral-future markets.
The strategy of Google, as stated by its CEO Eric Schmidt in 2008, was: "The goal of the company is customer satisfaction. You should think of Google as one product: customer satisfaction." Those customers are the world's advertisers and others who pay for its predictions. In 2010 he observed: "You give us more information about you, about your friends, and we can improve the quality of your searches. We don't need you to type at all. We know where you are. We know where you've been. We can more or less know what you're thinking about."
Over the years, fortifications have been erected in four key arenas to protect Google, and other surveillance capitalists, from political interference and critique: 1) the demonstration of Google's unique capabilities as a source of competitive advantage in electoral politics; 2) a deliberate blurring of public and private interests through relationships and aggressive lobbying activities; 3) a revolving door of personnel who migrated between Google and the US administration; and 4) Google's intentional campaign of influence over academic work and the larger cultural conversation so vital to policy formation, public opinion, and political perception.
Surveillance capitalism must be reckoned as a profoundly antidemocratic social force. In this context, Facebook’s Mark Zuckerberg offered his social network as the solution to the third modernity. He envisions a totalizing instrumentarian order – he calls it the new global “church” – that will connect the world’s people to “something greater than ourselves”. It will be Facebook, he says, that will address problems that are civilizational in scale and in scope, building the long-term infrastructure that will bring humanity together.
Larry Page, cofounder of Google, defended Google's unprecedented information power with an extraordinary statement, suggesting that people should trust Google more than democratic institutions: in general, he argued, having the data kept in companies like Google is better than having it in the government, with no due process required to get to the data, "because we obviously care about our reputation. I'm not sure the government cares about that as much".
Google, Facebook and surveillance capitalism in general have a common enemy: democratic institutions. As Zuboff shows, the European Union is foremost among them. "The human need for a space of inviolable refuge, the right to sanctuary, has persisted in civilized societies from ancient times, but is now under attack as surveillance capitalism creates a world of 'no exit', with profound implications for the human future at this new frontier of power." In May 2014 the Court of Justice of the EU asserted "the right to be forgotten" as a fundamental principle of EU law (Mario Costeja González vs. Google). The Luxembourg Court held that the free flow of information matters, but not as much as the safeguarding of dignity, privacy and data protection in the European rights regime. The Court conferred upon EU citizens the right to combat, ordering Google to establish a process for implementing users' delinking requests, and authorizing citizens to seek recourse in democratic institutions. The European Court of Justice thus demonstrated the unbearable lightness of the inevitable, declaring what is at stake for a human future, beginning with the primacy of democratic institutions in shaping a healthy and just digital future.
Conclusions
If the digital future is to be our home, then it is we who must make it so. We will need to decide. We will need to decide who decides. This is our fight for a human future. Physical places, including homes, are increasingly saturated with informational violations as our lives are rendered as 'behavior' and expropriated as 'surplus'. Other violations are simply imposed upon us, as in the case of the "talking dolls", the listening TV, the hundreds of apps programmed for secret rendition, and so on. If the billions of sensors capturing personal data fall outside the US Fourth Amendment's protections, a large-scale surveillance network will exist without constitutional limits.
Many hopes today are pinned on the new body of EU regulation known as the General Data Protection Regulation (GDPR), which became enforceable in May 2018. The EU approach fundamentally differs from that of the US in that companies must justify their data activities within the GDPR's regulatory framework. The regulation allows for class-action lawsuits in which users can join together to assert their rights to privacy and data protection. Ultimately, everything will depend upon how European societies interpret the new regulatory regime in legislation and in the courts: it will not be the wording of the regulations, but rather popular movements on the ground, that shape these interpretations. We need synthetic declarations that are institutionalized in new centers of democratic power, expertise, and legal battles that challenge today's asymmetries of knowledge and power.
We are living in a moment when surveillance capitalism and its instrumentarian power appear to be invincible. It is up to us to use our knowledge, to regain our bearings, to stir others to do the same, and to found a new beginning. The future of this narrative will depend upon indignant citizens, journalists, and scholars drawn to this frontier project, and upon indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities. The Berlin Wall fell for many reasons, but above all because the people of East Berlin said, "No more!" We too, Zuboff affirms, can be the authors of many "great and beautiful" new facts that reclaim the digital future as humanity's home. No more! Let this be our declaration.
These are the conclusions of Zuboff's research, and the beginning of a new story in which democratic institutions, starting with the European Union, together with ever more indignant citizens, can counterbalance the unprecedented power concentrated in the hands of a very few web companies. We have to decide, and decide who decides. To do that, we have to take our digital future into our own hands and regain our digital sovereignty.