In the technologically developed world, nearly every person has a presence on the internet, most commonly mediated by social media platforms such as Facebook and Twitter. Scraping consumer data for marketing purposes has been a staple of the social media industry since Facebook rose to the top of the sector, and the framework required to make targeted advertising work, along with the tools used in the process, opens up a bevy of new threats to people’s way of life on a global scale. This essay is an exegetical effort to elucidate how data scraping, algorithmic classification of data sets, and surveillance capital’s regime of certainty together constitute an important cog in a global pacification project. It also aims to highlight how the combined effects of these phenomena threaten our freedom to be wrong and herald an impending regime of absolute certainty, fueled by a fanatical devotion to economic efficiency.

It is well known that at the heart of all forms of capitalism lies a ‘grow or die’ incentive. This very phrase has become essential to almost every business model, a fact reflected quite publicly by business owners and high-ranking executives everywhere. Take, for example, Mark Schellinger, Co-founder and Director of Business Development at 123 Home Care, a leading non-medical home care company. In an article he wrote for Forbes, Mr. Schellinger said, “Growth is a framework that needs to drive all operational tasks, projects and initiatives of a company. The DNA of the company needs to be growth. Opportunities for growth are endless and should always be seized.”[1] The centrality of growth to capital is well known, but the definition of growth, as well as the measures considered necessary to stimulate it, has changed over time. In the mid-19th century, at the peak of industrial capitalism, Karl Marx explained the growth function of the capital system. Money is the main ingredient in the accumulation function, which Marx expresses as M – C – M′. When a commodity (C) is purchased using money (M), it is purchased with the intent to sell it for a higher price (M′), thereby creating what Marx calls “surplus value”, which is produced primarily by imbuing the commodity with some form of labour before reselling it. Marx claims that this version of money, that is, money in motion, deployed with the intent to create surplus value (or, as a more modern political economist would put it, invested), is capital. Furthermore, Marx claims that surplus value is not an end in itself, nor does it satisfy any particular need; its primary purpose is to produce more surplus value, which means that the movement of capital, and of the money which creates it, is theoretically endless. So long as there is a good which can be bought, imbued with labour, and sold once more for a surplus, “The movement of capital is…limitless.”[2]
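Marx’s circuit can be sketched compactly in symbols (the ΔM notation here is a convenient shorthand for surplus value, not Marx’s own):

```latex
M \;\rightarrow\; C \;\rightarrow\; M', \qquad M' = M + \Delta M, \quad \Delta M > 0
```

Because M′ can re-enter the circuit as fresh money capital, the sequence extends indefinitely, M → C → M′ → C′ → M″ → ⋯, which is the formal sense in which the movement of capital is limitless.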

The aforementioned conditions held at the height of industrial capitalism, when Marx published his treatise, but capitalism has changed significantly since then. Marxism remains a useful lens for critiquing capitalism in the 21st century, but some of the system’s new tenets, specifically its manipulation of new technologies, require a more precise critique. New technology allows capital accumulation to be exponentially more efficient. A basic example is an advanced robot that replaces a human in assembling factory parts. The robot can work faster, more consistently and more efficiently than any human, making the production of goods more cost-efficient. The robot is also a cheaper labourer, because robots cannot currently strike or demand higher wages. This drive for cost-efficiency has taken hold across nearly every industry, as the more cost-efficient a firm can be, the more goods it can produce and the better services it can offer. This is especially relevant in the techno-medical sector today, where health economists are looking to tailor technologies to health-service needs in order to reduce the cost of patient care. Health economist Sarah Thomas recommends “robotic process automation (RPA) to streamline administrative work and tedious back-office tasks” and the use of “real-time [data] analytics to manage [the] workforce…checking labor statistics every two hours, and staffing its workforce based on acuity and productivity targets.”[3] Cost-efficiency is a primary concern in the healthcare industry. This obsession with technological efficiency is also ever-present in the marketing industry, as I will elucidate below.

Devotion to efficiency and its reification is rarely critiqued in the 21st century, likely because efficiency directly stimulates growth. In the mid-20th century, however, the French sociologist and theologian Jacques Ellul provided a resounding and timeless critique of technology, or, as he put it, the phenomenon of technique. He defines technique as “the totality of methods rationally arrived at and having absolute efficiency (for a given stage of development) in every field of human activity.”[4] Technology, on this view, is simply a methodology defined by its absolute efficiency and rationality, and by its pervasiveness throughout human life. Contemporary ethicist and theologian Darrell Fasching explains Ellul’s theory as holding that efficiency becomes a necessity in the technological society, and that eventually its domination becomes absolute. As Ellul says, “…the infusion of some more or less vague sentiment of human welfare cannot alter it. Not even the moral conversion of the technicians could make a difference.” Instead, technique tends to “create a completely independent technical morality.”[5] Thus, technology represents an overwhelming force by which both civilization and its understanding of the world are changed.

The algorithm is a perfect example of this efficiency-driven movement of technical development. Alan Turing, one of its most important developers, held that the purpose of the algorithm was to replicate the complex operations of the human brain during calculation while increasing the consistency and accuracy of those calculations. Turing’s goal was to increase the efficiency of calculation by eliminating the human factor, making efficiency the algorithm’s defining aim. The algorithm has since seen countless applications, of which the most important to this discussion are targeted marketing and online advertising. This is reflected in a report from the Office of the Privacy Commissioner of Canada, which states the following:

Behavioural advertisers often use sophisticated algorithms to analyze web histories, build detailed personal profiles of users, and assign them to various interest categories. Interest categories are then used to present ads thought to be relevant to users in those categories. Ads can also be targeted based on specific websites that users have visited recently (often called retargeting or remarketing).[6]

So beyond what algorithms do, how do they work? Though this information is not closely guarded, it is nearly impenetrable to those outside the computer science discipline. A very basic example is a rule-based system: an algorithm that uses human-crafted rules and if-then-else statements to make a determination or produce an outcome. For example: IF the employee does not sell 500 units by the end of the fiscal period, THEN reduce the employee’s wage by 3%, ELSE pay the employee in full. Such algorithms are useful but require human effort to set up and lack flexibility: if the employee had been pregnant and on maternity leave, this rule would wrongfully cut their wages for falling under the target. Today, most algorithmic marketing is done using some combination of inference engines and hierarchical learning. The former refers to a system that derives rules automatically by classifying, interpreting and evaluating raw data, thereby turning it into meaningful information. An inference engine could examine ten years’ worth of umbrella sales in Ontario, recognize that more umbrellas are purchased in March and April than in any other month, and deduce the rule that umbrellas are in higher demand during those months. This is an example of an inference engine “forward chaining”: it deduces a future outcome from known facts that it generates through its own ability to infer.[7] Hierarchical learning is a methodology by which different tasks are assigned the different varieties of learning required to conduct them; handwriting recognition and speech recognition, for example, require different learning methods.[8] Applying this to inference systems improves their ability to make judgements and formulate rules; the result is techniques such as document classification, image classification, sentiment analysis, product design and financial management.[9]
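The two approaches above can be sketched in a few lines of code. This is a minimal illustration, not any firm’s actual system: the wage rule and the umbrella data are the hypothetical examples from this essay, and the 1.5× threshold for “high demand” is an assumption chosen for the sketch.

```python
from statistics import mean

# --- Rule-based system: a hand-written IF/THEN/ELSE rule ---
def wage_adjustment(units_sold, on_leave=False, target=500):
    """Return the wage multiplier for one employee under the sales-target rule."""
    if on_leave:              # the human-added exception the rigid rule lacked
        return 1.00
    if units_sold < target:   # IF under target THEN reduce wage by 3%
        return 0.97
    return 1.00               # ELSE pay the employee in full

# --- Forward chaining: a rule inferred automatically from raw data ---
# Toy umbrella-sales data: {month: units sold in each of two years}.
sales = {
    "Jan": [40, 42], "Feb": [45, 44], "Mar": [90, 95],
    "Apr": [88, 92], "May": [50, 48], "Jun": [30, 33],
}

def infer_high_demand_months(sales, factor=1.5):
    """Derive a rule from the data: months whose mean sales exceed
    `factor` times the overall mean are classified as high-demand."""
    overall = mean(v for month in sales.values() for v in month)
    return [m for m, s in sales.items() if mean(s) > factor * overall]

print(wage_adjustment(450))                 # 0.97
print(wage_adjustment(450, on_leave=True))  # 1.0
print(infer_high_demand_months(sales))      # ['Mar', 'Apr']
```

The contrast is the point: the first function only knows rules a human wrote for it, while the second generates its rule (“umbrellas are in higher demand in March and April”) from the data itself, which is what lets inference engines scale without per-case human input.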

These algorithm-based data analysis technologies are extremely effective and accurate not only at identifying consumer tendencies but at predicting them as well. This is useful in the world of marketing and predictive analytics: firms can now predict what people will want, allowing them to devise strategic marketing campaigns. For example, information on products likely to be purchased is presented upfront, making purchase easier and more likely. Charles Duhigg highlighted this in an article on mega-retailer Target’s marketing strategy aimed at pregnant women, whose goal was “figuring out if a customer is pregnant, even if she didn’t want us to know…” According to statistics expert Andrew Pole, who was tasked with this project, advertisers had realized that marketing products to pregnant women is an issue of timing: when birth records become public, couples are bombarded with offers and advertisements from countless marketers all at once, making it easy for any one firm to be drowned out among the rest. So the advertisers wanted to find a way to target women in their second trimester, thereby increasing the cost-efficiency of their marketing efforts.[10] By assigning each shopper what is referred to as a Guest ID number, information on that shopper can be recorded and analyzed using a combination of technologies from the company’s predictive analytics department and strategies developed in the fields of neurology and psychology.[11] This way, advertisers know what consumers want before consumers even have the chance to tell them explicitly. Surveillance capitalism specialist Shoshana Zuboff, drawing on the public statements of Google’s chief economist Hal Varian, identified the final goal of predictive analytics:
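The Guest-ID approach Duhigg describes can be caricatured as a scoring model over purchase histories. The sketch below is purely illustrative: the signal products echo those mentioned in Duhigg’s reporting (unscented lotion, supplements, cotton balls), but the weights, the threshold and the data are invented for this example and bear no relation to Target’s actual model.

```python
# Hypothetical signal weights -- invented for illustration only.
SIGNAL_WEIGHTS = {
    "unscented_lotion": 0.3,
    "calcium_supplement": 0.25,
    "large_tote_bag": 0.15,
    "cotton_balls": 0.2,
}

def prediction_score(purchase_history):
    """Sum the weights of signal products present in a shopper's history."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in set(purchase_history))

# Purchase logs keyed by Guest ID (toy data).
guests = {
    "G-1001": ["unscented_lotion", "calcium_supplement", "bread"],
    "G-1002": ["bread", "milk"],
}

# Shoppers whose score clears a threshold get the targeted campaign.
flagged = [gid for gid, h in guests.items() if prediction_score(h) >= 0.5]
print(flagged)  # ['G-1001']
```

Even this toy version shows why the method is powerful: no single purchase is revealing, but a weighted combination of innocuous purchases, tied to a persistent identifier, yields a confident prediction about a shopper’s private life.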

Varian claims that ‘nowadays, people have come to expect personalized search results and ads.’ He says that Google wants to do even more. Instead of having to ask Google questions, it should ‘know what you want and tell you before you ask the question.’ ‘That vision,’ he asserts, ‘has now been realized by Google Now…’[12]

Therefore, though consumers may not understand why or how Google knows what they want at any given time, their way of life becomes so entwined with this phenomenon that they cease even to question it. Google increases the cost-efficiency of its marketing techniques, and consumers enjoy ever more convenience.

Perhaps one of the most ubiquitous and universal activities in the modern human experience is not reading terms of service agreements. A 2017 Deloitte study found that 91% of people click ‘I Agree’ without reading a word of what they are agreeing to.[13] There are various issues of informed consent here, and a clear and urgent need for improved regulation; however, for the purposes of this essay, the service itself is of primary interest. Zuboff, with the help of her interlocutor Varian, explains that the root of this indifference is convenience. According to Varian, “There is no putting the genie back in the bottle…Everyone will expect to be tracked and monitored, since the advantages, in terms of convenience, safety, and services, will be so great…continuous monitoring will be the norm.”[14] Varian goes on to argue that people are willing to share their information because “they get something in return.”[15] Convenience is a powerful phenomenon in a world dominated by cost-efficiency-focused corporations: as exemplified by the Canada Post lockout case, workers are expected to work hard and be expendable. If those workers order through Uber Eats, it is highly unlikely that they care what the terms of service agreement says or what is happening to their data. As Varian suggests, once this happens on a large scale, technologies that provide convenience while scraping away valuable consumer data become normal, and even expected. If consumers want relevant ads because they do not want to do any research for their online purchases, they must submit to the prying eyes of advertisers around the globe, and in many cases of other types of firms as well.

Zuboff highlights that this development represents a redistribution of privacy rights: those who occupy the managerial positions of big data and social media firms enjoy extensive privacy rights, and very few individuals care to uncover the mysteries of these firms and their methodologies, while consumers are deprived of any choice in the matter. Some of a consumer’s information may remain private, and some may be harvested by one of these mega-firms, but ultimately that decision is not up to the consumer.[16] Most people regard these technologies as a need that is “essential for basic social participation.”[17] This “extractive operation” is driven by and rooted in what Ellul would describe as the sanctification of technique, as well as technique’s pervasiveness in modern society.

The extent to which technique has become a need for people in the technologically developed world, and the mysterious, magic-adjacent operations of these advanced techniques, render populations worldwide passive and submissive to the extractive efforts of advertisers, communications consultancies and authoritarian governments, among many other organizations. The drive for efficiency and growth has made technology a crutch on which global society is totally dependent. This has created an attitude among individuals in technologically developed countries that they must offer up their personal information in order to operate socially. Regulatory efforts by states all over the world are required to redistribute privacy rights effectively and justly, and thereby curb the growing power of the private sector, which gathers, packages, sells and wields consumer data. To protect global society, we must resist the drive to be efficient at the expense of all else and promote legislation which justly redistributes the privacy of individuals everywhere, lest our humanity become entirely a commodity.


Jonah Somers is a graduate student in Political Economy at Carleton University, where he researches surveillance technologies, commodification and privacy.



[1]      Mark Schellinger, “In Business, You’re Either Growing Or You’re Dying,” Forbes, last modified March 23, 2018.
[2]      Karl Marx, “Capital” in Karl Marx Selected Works, ed. Lawrence H. Simon (Indianapolis: Hackett Publishing Company Inc., 1994), 259-262.
[3]      Sarah Thomas, “From Costly to Cost-effective: How Technology Can Help Hospitals’ Bottom Line,” last modified November 7, 2017.
[4]      Jacques Ellul, The Technological Society (New York: Random House, 1964), xxv.
[5]      Ellul, The Technological Society, 97.
[6]      Technology Analysis Branch of the Office of the Privacy Commissioner of Canada, “Online Behavioural Advertising (OBA) Follow Up Research Project,” June 2015, 1.
[7]      A.K. Pradeep, Andrew Appel and Stan Sthanunathan, AI for Marketing and Product Innovation: Powerful New Tools for Predicting Trends, Connecting with Customers and Closing Sales (New Jersey: John Wiley & Sons, 2019), 10.
[8]     Pradeep et al., AI for Marketing and Product Innovation, 12-13.
[9]     Ibid., 11, 13.
[10]    Charles Duhigg, “How Companies Learn Your Secrets,” last modified February 16, 2012.
[11]     Duhigg, “How Companies Learn Your Secrets.”
[12]     Shoshana Zuboff, “Big Other: surveillance capitalism and the prospects of an information civilization,” Journal of Information Technology 30, no. 1 (March 2015): 83.
[13]     Caroline Cakebread, “You’re not alone, no one reads terms of service agreements,” last modified November 15, 2017.
[14]      Zuboff, “Big Other,” 82.
[15]      Zuboff, “Big Other,” 83.
[16]      Ibid., 82.
[17]      Ibid., 83.


Banner image by Lianhao Qu, courtesy of Unsplash.
