Surveillance Capitalism: The Abuse of Data and the Human Experience
By Anonymous / Winter 2020
In 2018, one of the biggest political scandals in recent history unfolded: it was revealed that Cambridge Analytica, a data analytics firm, had harvested the personal data of millions of Facebook users without consent, and that this data had been used to support Donald Trump’s election campaign. It is frightening to think that our most prominent political office can be influenced and manipulated through data collection and analytics, but sadly, this is the new reality we live in. Prior to 2018, few people knew the extent to which data science was being used to reshape our reality and our perception of the world. To quote Christopher Wylie, a whistleblower from Cambridge Analytica: “We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis that the entire company was built on” (Reuters, 2018). When the “leadership of the free world” can be so covertly manipulated by private companies, something is wrong. If data analytics have become this powerful, it is critical that we understand their role in society – and perhaps even limit that role.
Data is an essentially unlimited resource. We generate it through our every action – every click, every purchase, every walk through the camera-laden city. And from this vast and untamed resource field, companies can extract insights into people, their actions, their desires. On the surface, this may sound like just another useful marketing tool, the savvy use of another strategy to target consumers. But viewed through a critical lens, a deeper problem emerges. “Surveillance capitalism claims private human experience for the market dynamic, and it repurposes private human experience as a free source of raw material for production and sales” (Zuboff, 2019). This sounds eerily similar to how capitalism and industrialism view the environment, and we know how badly the environment has fared under that viewpoint. Can we expect the same to happen to our privacy, now that our personal information and data have also been turned into a commodity? In Day 16’s lecture, we went over capitalism’s relationship with nature – one where nature is a machine, to be abused for the sake of extracting resources and maximizing production. Similarly, we are now being treated as machines, our data extracted to maximize companies’ ability to manipulate and influence consumers. When data is viewed as an unlimited resource, one that we are constantly generating, there is no downside to collecting every tiny datastream a person generates, so long as it works towards the “greater good” of profit. But to the people whose data is being used, their human experience is not just a limitless resource for companies to exploit. It is the sum total of their lives, the memories made and forgotten that compose them as a person. Unfortunately, such nuance and humanity is lost on a data analytics algorithm spitting out the ideal advertisement to influence your vote.
These violations of privacy through data collection are only the first piece of the puzzle. If we consider data analytics as a three-step process – collection, analysis, and application – we have so far only discussed the dangers of collecting data without regard to privacy. But what about the analysis phase? As a researcher in machine learning, I can speak with some confidence about the techniques being used for data analytics. Rather than focus on the technical brilliance of the various algorithms that have been developed, however, I will discuss the most dangerous part of data analytics – and one of the greatest flaws of surveillance capitalism. Though companies go to great lengths to collect massive data repositories, often through unethical or even illegal means, that data tends to be unstructured or incomprehensible. Even to data scientists, the form or nature of the data is often not important. Our algorithms handle that for us – and therein lies the problem. The algorithms used for data analytics are “black boxes”. In other words, we plug data in and get a result out, but we do not know why that result was selected. There is often no practical way to tell why a complex machine learning model predicted something. So when we use a machine learning model to predict recidivism rates – to decide whether someone will go back to jail after release – the model could be biased, unfair, or broken, and we simply would not know until it had gotten things wrong enough times. Unfortunately, in this case, that could mean the difference between a life sentence and a slap on the wrist (Ozkan, 2017). Not only is surveillance capitalism violating our privacy and the sanctity of our personal experience, it is also utterly unassailable in its current form. To question the accuracy and usefulness of data analytics is to question the supreme power of machine learning, technology, and progress – and thus we shut down discussion and criticism of the topic.
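To make the “black box” idea concrete, consider the following minimal sketch. The weights and input features here are entirely made up for illustration – this is not any real recidivism model – but the structure mirrors how such models work: inputs pass through layers of opaque numbers to produce a score, and nothing in those numbers explains why the score came out the way it did.

```python
import math

# A toy two-layer network with fixed, opaque weights.
# The weights and the meaning of the three input features are
# hypothetical -- the point is that the numbers offer no rationale.
W1 = [[0.8, -1.2, 0.5],   # hidden-layer weights
      [-0.3, 0.9, 1.1]]
W2 = [1.4, -0.7]          # output-layer weights

def predict_risk(features):
    """Map three numeric features to a 'risk score' between 0 and 1."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)))
              for row in W1]
    score = sum(w * h for w, h in zip(W2, hidden))
    return 1 / (1 + math.exp(-score))  # squash to a probability-like value

# Two people who differ in a single feature receive different scores,
# but the model provides no human-readable reason for the difference.
print(predict_risk([1.0, 0.0, 2.0]))
print(predict_risk([1.0, 1.0, 2.0]))
```

A real model contains millions of such weights rather than eight, which is precisely why the question “why did the model decide that?” is so hard to answer – and why a biased or broken model can go undetected until it has already done harm.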
Its hegemony is unquestioned, for surveillance capitalism has wormed its way into our daily life to such a point that it is the norm, indisputable and utterly unimpeachable. It is treated as the best – and only – way for companies to view people and their data.
Suppose we accept this hegemony, accept invasive data collection, and truly trust our data analytics algorithms to produce accurate results 100% of the time. The question then becomes: do we trust companies, governments, and other data science practitioners to use their results for good? Let us see where data analytics and surveillance are being used in the real world. Over the last several years, multiple studies have been published on using machine learning algorithms to predict a person’s sexual orientation from their face (Leuner, 2018). The accuracy of the results is troubling. Without consent, algorithms can infer one of the most private and secret parts of a person’s identity – without that person ever knowing they were analyzed. And those results can be devastating. “The predictability of sexual orientation could have serious and even life threatening implications to gay men and women and the society as a whole. In some cultures, gay men and women still suffer physical and psychological abuse at the hands of governments, neighbors, and even their own families” (Wang, 2018). The danger of such algorithms is further demonstrated in China’s Xinjiang region, home to much of the country’s Muslim population, where the government and local authorities are leveraging data analytics alongside vast surveillance networks to crack down on minorities and “re-educate” Uighur Muslims. The stories coming out of the region are truly horrifying. People are analyzed, often under the guise of health checks, only to be carted off to detention centers as suspected terrorists.
“The police administered what they call a “health check”, which involved collecting several types of biometric data, including DNA, blood type, fingerprints, voice recordings and face scans – a process that all adults in the Uighur autonomous region of Xinjiang, in north-west China, are expected to undergo.
After his “health check”, Alim was transported to one of the hundreds of detention centres that dot north-west China. These centres have become an important part of what Xi Jinping’s government calls the “people’s war on terror”, a campaign launched in 2014, which focuses on Xinjiang, a region with a population of roughly 25 million people, just under half of whom are Uighur Muslims” (Byler, 2019).
With data analytics, the government turns this “hunt” for “terrorists” into an exact science. Facial recognition technologies are proving particularly useful in this effort.
“Already, China is exploring using facial recognition technology to sort people by ethnicity. It is also researching how to use DNA to tell if a person is a Uighur. Research on the genetics behind the faces of Tumxuk’s men could help bridge the two.
The Chinese government is building “essentially technologies used for hunting people,” said Mark Munsterhjelm, an assistant professor at the University of Windsor in Ontario who tracks Chinese interest in the technology” (Wee, 2019).
But surely there is great outrage about facial recognition and data analytics being used for these things? Or has it been subsumed into the great umbrella of surveillance capitalism, an acceptable and expected method of data collection and use – as one might expect when people are seen as resources rather than humans? There has been a notable lack of backlash on this subject – and even support from the research community abroad. European research institutes provide funding to the technology groups most closely involved in the project, and only some academics are attempting to call out the complicity of a scientific world that looks on, complacent (Wee, 2019).
But the truth is that under the current structure, researchers and academics can hardly be blamed. We live in a world under surveillance capitalism. In this world, humans are unlimited sources of data, to be gathered in any way possible. Technology and science go unquestioned in their dominance – much like the hegemonic institutions we learned about in lecture. To apply these techniques in horrific and abusive ways is acceptable and expected – after all, how could we possibly question surveillance capitalism, even the parts that harm people? The situation can seem bleak. Yet there was once a more hopeful path available to us, before surveillance capitalism became so dominant. In a podcast from the Harvard Business Review, Shoshana Zuboff outlines the more hopeful origins of surveillance capitalism, something she calls “digital capitalism”. In her 2001 book, she describes how digital capitalism could be used to support people, rather than use them. So many of society’s problems can be solved with data-based approaches. “We need data to solve the climate crisis. We need data to solve chronic puzzles of health and well-being. What’s happened here, is that digital technology has been hijacked. The technology for which we harbored so much hope for empowerment, emancipation, democratization of knowledge – this entire technology surround has been hijacked by a self-serving economic logic” (Zuboff, 2019). Most of today’s data scientists work for the leading surveillance capitalists, and so they inevitably feed further into this hegemonic system rather than applying the power of data science to the problems it could solve. They feed the “second gilded age” – not one of industrial capitalists fueled by cheap labor, but one of surveillance capitalists fueled by cheap data.
By “owning” the data of their consumers, they force the rest of us back into a “dark age” of knowledge, where the only sources of insight and analytics are the surveillance capitalists, who use this power only to further their own position.
Perhaps we can reclaim that hopeful path, outlined in Zuboff’s 2001 book. Perhaps we can eliminate the corruption of data science by self-serving economic logic and re-apply it to support human interests as a whole. The solution lies in the recurring thread running through the previous examples: in every case, in every problem with surveillance capitalism, people are treated as resources, not as humans. If we flip that idea – people are humans, not resources – we come to a clear conclusion. The human experience should be untouched by data analytics, because it should not be a resource to be exploited. There are classes of data generated by humans that should not be available for companies to exploit. “To make an omelette, you must break some eggs”, said Milton Friedman. Well, some eggs (data) should not be breakable (accessible). Private and government use of data analytics to predict human behavior must be regulated, or else society is easily corrupted and manipulated. Through the democratic tradition – by giving the citizenry control – people can vote to decide what data should be protected and what uses of data analytics are acceptable and useful to society. Through the democratic tradition, people can take back control of their data rather than allowing surveillance capitalists to dominate and “own” their experience. People ought to be able to consent freely to the use of their data if they feel it benefits them. Through the keys of democracy and consent, we can unlock a better path for society and steer away from the hegemony of surveillance capitalism.
Surveillance capitalism is one of the greatest threats facing us today. The power of data science to manipulate and control people’s behavior is truly terrifying. It is difficult to move away from this paradigm, immersed in it as we are – as is the case with every hegemony – but it is possible. First, we must identify the new threats and proposals designed to advance and protect this hegemony. Recently, there have been proposals to use camera-equipped drones, combined with facial recognition and data analytics, to monitor people’s activities during the coronavirus shutdown (Heilweil, 2020). Through proposals like this, yet more power and trust is handed to the companies running these systems, and more power is taken away from the people. Every time our personal data is used without consent, we give up some of our own human experience and allow it to be exploited by groups whose aims may not align with our own. In today’s world, we take it as a given that people should have control over their own persons, that their humanity cannot be bought and sold as a commodity. And yet we allow the collection, analysis, and commoditization of the data that composes our human experience. Such a contradiction cannot be allowed to stand. But through the democratic tradition, through handing control back to the people, we can ensure that our data is used only in ways that benefit us – ways that we approve of and allow. Surveillance capitalism is the hegemonic power of today, a power that seems utterly unstoppable. But if we want to follow the words of the founding fathers of the U.S. – “Governments are instituted among men, deriving their just powers from the consent of the governed” – we must remember that our consent, our democratic say, is critical to the prevention of tyranny and the protection of our rights.
If we are to stop the spread of surveillance capitalism’s influence, its abuse and exploitation of the human experience, we must take back control of our data.
Works Cited
Ingram, David, and Peter Henderson. “Trump Consultants Harvested Data from 50 Million Facebook Users: Reports.” Reuters, 16 Mar. 2018.
Zuboff, Shoshana. “Surveillance Capitalism.” Technology, Harvard Business Review, 19 June 2019.
Ozkan, Turgut. “Predicting Recidivism through Machine Learning.” Thesis, 2017.
Leuner, John. “A Replication Study: Machine Learning Models Are Capable of Predicting Sexual Orientation From Facial Images.” University of Pretoria, 2018.
Wang, Yilun, and Michal Kosinski. “Deep Neural Networks Are More Accurate than Humans at Detecting Sexual Orientation from Facial Images.” Journal of Personality and Social Psychology, 2018.
Byler, Darren. “China’s Hi-Tech War on Its Muslim Minority.” The Guardian, 11 Apr. 2019.
Wee, Sui-Lee. “China Uses DNA to Map Faces, With Help From the West.” New York Times, 3 Dec. 2019.
Heilweil, Rebecca. “Coronavirus Is the First Big Test for Futuristic Tech That Can Prevent Pandemics.” Vox, 27 Feb. 2020.
Class Materials:
Twohig. Lecture 16: Capitalism and Nature
Twohig. Lecture 6: Dominant ideologies and the Second Gilded Age
The One Percent: Capitalism/Industrialism – Milton Friedman Interview