Room 101 by @BloggersRUs

by Tom Sullivan


Still image from 1984 (1984).

'Room 101,' said the officer.

The man's face, already very pale, turned a colour Winston would not have believed possible. It was definitely, unmistakably, a shade of green.


Capitalist acts between consenting adults have been taking place since before Hammurabi. But they were virtually all small-scale and interpersonal, committed in the souk, in the town square, or one-on-one. Only in the age of unregulated, global capitalism have those acts themselves become commodities for trade. The Code of Hammurabi and older collections of laws recognized the need for rules governing men, particularly in trade. Unregulated capitalism does not, nor does surveillance capitalism.

Harvard Business School professor Shoshana Zuboff discusses with Noah Kulwin the threat she describes in her book, "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power."

The collection of personal data via the Internet represents a “coup from above,” the next phase of unregulated capitalism turning people (you) into products. This is more than simply harvesting who your friends are or your taste in music or products. There is invisible metadata in what we do online that, when aggregated, has "tremendous predictive value that you don’t know you’re communicating when you are posting or when you’re searching or when you’re browsing."

How one structures and punctuates sentences, even one's use of periods, ellipses, or semicolons, when run through a five-factor personality model and combined with data from millions of others, can generate "fine-grain predictions" about one's other behaviors, Zuboff believes:

This is happening with parallel processing with millions and millions of data points and ends up being able to predict, as we say with Cambridge Analytica, just you know pivoting these methodologies which are the sort of normal “day in the life” methodologies of any self-respecting surveillance capitalist, just pivoting those slightly from commercial outcomes to political outcomes, and you get really robust predictions about how people will react to certain kinds of material, triggers, stimuli, and so on.

Surveillance capitalism is an adaptation to current technologies and forms a kind of "information panopticon" in which every element of our interactions online and with "smart" technologies at home or on our phones becomes product:

We’ve had mercantile capitalism, and we’ve had factory capitalism, and mass-production capitalism, and managerial capitalism, financial capitalism. And typically what happens in these new concepts is that modifier, like “mass production,” or in my case “surveillance” capitalism, what that modifier is doing is pinpointing the pivot of value creation in this new market form.

Surveillance capitalism is coming for your personal experience so that it can "tune and herd populations toward guaranteed commercial outcomes," Zuboff warns:

It substitutes computation for politics, so it’s post-democracy. It substitutes populations for societies, statistics for citizens, and computation for politics, and so I read a lot about this experimental zone, the Facebook contagion experiments, where another experimental zone where when they wrote up those experiments in scholarly journals — very smart data researchers from Facebook combined with very smart academic scholars boasted that now we know that we can use the online world to create contagion that changes behavior in the real world. The first case it was voting, the second case it was emotional. And they bragged in their articles that we can do this in a way that bypasses the awareness of the user. Right? Always engineered for ignorance. Because you know, that’s the surveillance essence of this, you can’t do this by asking permission. You can only do this by taking it in a way that is secret, that is hidden, that is backstage.

So, a more digital form of disaster capitalism in which democracy is mere window dressing for furthering the commercial desires of financial interests. Computational governance becomes a new form of absolutism. Pivoting this kind of technology from commercial ends to political ones is a mere flip of a switch.

Zuboff believes there is still room for political maneuvering to stop it, but that will take a doubling down on democracy. Surveillance capitalism has advanced as far as it has because it is near-invisible. "Every single piece of research, going all the way back to the early 2000s, shows that whenever you expose people to what’s really going on behind the scenes with surveillance capitalism, they don’t want anything to do [with] it," she says. But as with other forms of metastasized capitalism, people right now feel they have no choice. Capitalism is all about expanding our choices, remember?

This makes surveillance capitalism "a colossal market failure. Because it is not giving people what people want. It’s giving business customers what they want, to be able to manipulate people, but it’s not giving actual populations of people what we want."

That has some rather serious political implications, as is already evident in China. But this is instrumentarian rather than totalitarian power, Zuboff asserts. It doesn't want to harm you, but to make you better serve commercial interests. Combine that with an authoritarian state such as China, however? Zuboff warns, "[A]nd what we saw in America already is that anybody with enough money, any ambitious plutocrat, can buy the skills and the data to use these same methodologies to influence political outcomes."

I don't recall how the Ministry of Love knew Winston Smith had an existential fear of rats. But in 2019, Robert Mercer or someone like him could probably sell them that information.