The Growing Knowledge Imbalance
Knowledge inequality has shifted from text to data, and we are overmatched
One sign of the dismal turn the Internet has taken is how society has become less scientifically literate, more confused, and less informed on multiple levels. It has also become more unequal, more authoritarian, and more elitist.
This is precisely what was not supposed to happen.
The dominant forces driving these changes haven’t been the traditional ones — governments, universities, and citizens. Rather, the future is being shaped by commercial surveillance and its side-effects, as outlined in a recent interview with Shoshana Zuboff on the “Recode/Decode” podcast.
Zuboff is the author of a new book titled “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.” Zuboff is alarmed by what she’s seen over her long career analyzing changes in capitalism’s focus and behavior. What started decades ago as organizations and cultures being stripped of muscle and sinew in order to pump up shareholder value has become an entire century’s framework being set by a major new economic theory — the theory that things are free if you’re willing to give up a lot of rights and information in ways that are concealed from you:
When we talk about surveillance capitalism, just as industrial capitalism gave us the culture and the quality and the moral milieu of our industrial society and our industrial civilization, right now surveillance capitalism dominates, and if we don’t stop it, it’s going to define the moral milieu and the culture and the nature of 21st century society.
These are not new threads, but Zuboff puts them together in a skein that grabbed my attention.
Surveillance capitalism is accomplished quietly. Secrecy and surreptitiousness are not normal or ethical approaches to observing human subjects, especially on a large scale. But that’s how it’s being realized.
Yet, because its purveyors have set the table of expectations so that information is expected to be free, and so that Silicon Valley entrepreneurs are viewed as the good guys, we have organizations like Cold Spring Harbor Labs and bioRxiv accepting money laundered from Facebook into CZI to fund free information initiatives. This, despite behavior that in a prior decade would likely have led to protests, campus activists, and investigations into corrupt business practices.
Zuboff underscores how nefarious the data acquisition has been. She notes there was really no other way for these organizations to do it. After all, if the head of a large company came up to you and asked you to share your contact list, your friends’ names, and a list of web sites you’ve visited in the past six weeks, you’d tell that person to go away (or worse). But do it like the researcher involved with Cambridge Analytica — via a quiz or game — and you can sneak data out from under the noses of users. Other options include:
- Gathering data via a tracking pixel — a mechanism very few people even know exists.
- Deriving data from browser location services, IP address probing, and machine profiling.
- Accessing personal and biometric data via security features like fingerprint or facial recognition for device activation.
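The first mechanism in that list, the tracking pixel, is simple enough to sketch. In this minimal Python illustration (all names and parameters are hypothetical), the embedded 1×1 image itself is inert; the identifying data ride along in the URL of the HTTP request that fetches it, which the tracker's server then logs:

```python
import base64
from urllib.parse import urlencode, urlparse, parse_qs

# A 1x1 transparent GIF. The image carries no information; the tracking
# value is entirely in the HTTP request a browser makes to fetch it.
TRANSPARENT_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def pixel_url(base: str, user_id: str, page: str) -> str:
    """Build the image URL a page would embed, smuggling identifiers
    out as query parameters (parameter names are hypothetical)."""
    return base + "?" + urlencode({"uid": user_id, "page": page})

def log_pixel_request(url: str) -> dict:
    """Server side: recover the identifiers from the request URL,
    as a tracker would before returning the invisible GIF."""
    qs = parse_qs(urlparse(url).query)
    return {"uid": qs["uid"][0], "page": qs["page"][0]}
```

The user sees only an invisible image; the tracker sees who loaded which page, from which IP, at what time — which is why the mechanism stays below most people’s awareness.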
But, as Zuboff notes, you have to do it on cat’s feet:
[These companies] understood early on that if they’re going to get this surplus data, they had to do it surreptitiously. They had to do it through what I call the social relations of the one-way mirror. To take without asking. And early on, you look at many of those early patents and you see the scientists actually defining in a very positive way, “We can get data that people did not intend to disclose. We can get data that people don’t even know they disclosed, because we can fit together different bits and pieces and make deductions and inferences. Therefore, we can come up with profiles and insights and patterns about individuals and groups and so forth that people don’t even know they’re giving away and did not agree to give away.”
So from the beginning, for this thing to work to get that behavioral surplus, they had to do it secretly. They had to do it backstage. They had to do it with mechanisms that were designed to keep us ignorant, designed to bypass our awareness.
Yesterday, news broke that Google had secretly placed a microphone in its Nest Secure home devices, and, now that Google Assistant integration is possible, the microphone will be activated. As one report put it:
Google did not include mention of the microphone when it announced the launch of its Nest Secure system in 2017. The company now says that was an oversight and the mic was included for the purpose of offering expanded features through the system in the future.
In a separate interview with the Commonwealth Club, Roger McNamee, as part of his book tour supporting “Zucked: Waking Up to the Facebook Catastrophe,” notes that these companies are not standing still, and that their practices are “metastasizing”:
What I’m suggesting is the problem is still metastasizing. . . . There’s a tendency — because I first noticed this at the beginning of 2016, and it really came into focus with Black Lives Matter, Brexit, the violations of the Fair Housing Act that I saw in the fall, and then the 2016 election — there’s a tendency to look at this through a rearview mirror and say, “All of these things happened.” Almost all the changes Facebook has made have been about addressing issues that emerged in 2016. The problem here is that there are new platforms — the Internet of Things, smart devices, smart speakers, smart TVs, smart appliances, smart cars — most of them using Amazon Alexa or Google Home as the user interface, and all of them essentially using Android as the operating system. Those things are taking surveillance into places it’s never been before, and they’re changing the nature of it to an “always on” model.
In aggregate, these companies know a lot more than we do about society and people — health, relationships, friends, connections, and more. They likely know your contacts, your brand preferences, and a good portion of your schedule and daily routines.
Yet, these companies are unaccountable and opaque, despite having more knowledge than any paywall has ever protected. As Zuboff says:
Right now, surveillance capitalists sit on a huge asymmetry of knowledge. They have an asymmetry of knowledge, a concentration of knowledge unlike anything ever seen in human history. And with that knowledge comes, as we’ve talked about before, the ability to actually shape and modify our behavior to tune us and herd us toward their commercial outcomes.
There are hints — data portability, the right to be forgotten — that things could change, but only if we demand it. As Zuboff suggests, we need new forms of collective action around the surveillance economy — to demand information about what data are being shared, how they’re being used, and how we can stop it.
Zuboff also believes we need to insist on closed loops — products that give us all the benefits of connected devices without the surveillance. Does your thermostat need to send data to Amazon or Google for it to be accessible to you via an app and to provide analytics? Does your voice command have to be relayed back across the web and logged in a database to play that playlist? I can imagine a time when technology is sold as “Network Safe,” meaning it can exist on a network without sending data to anyone other than its owners.
Another vital part of why activism is important is that the leaders of these companies — namely, Mark Zuckerberg, but others as well — don’t seem to believe in democracy, as Zuboff’s interview with Kara Swisher touches on:
Swisher: I did an interview with Mark Zuckerberg, and one of the things he put forward [was] . . . well, you know, what they’re doing in China, they’re doing all this surveilling, this facial recognition, this and that, I’m thinking “you’d love to do that, Mark Zuckerberg.” But he essentially was putting out the term, “It’s either Xi or me.” . . .
Zuboff: What that statement does is [tell us that he] has given up on democracy. . . . Every generation has to step up to the responsibility to reclaim, to fight, to resuscitate, to maintain the flourishing and the growth and the deeper rooting institutionalization of these ideas. We cannot let this go. Mark [Zuckerberg] has already let it go. He’s a cynic on democracy, but I’m not.
The information sphere is very damaged compared to what it was like 10-20 years ago. Censorship via flooding. Surveillance via surreptitious data acquisition. Psychological manipulation for commercial and political ends.
Our policies and practices need to reflect these new realities.
If Zuboff is right, the entire century ahead may depend on whether we’re willing to fight for the right things now. It won’t be healthy if the 21st century becomes known as the Century of Surveillance.