Flooding Is the New Censorship

Too much information is today's tool for suppressing the truth and confusing the world

I hate to bite the hand that feeds me
So much information.
The pressure’s on the screen
to sell you things that you don’t need.
It’s too much
information.
- Duran Duran, “Too Much Information,” 1993

I recently wrote a post about the new threats to information access, namely geofences and filter bubbles, both of which are harder to overcome or detect than paywalls, the oft-maligned source of subscription revenues.

Now, in addition to fences, we have to consider floods and bots and their interactions with bubbles.

A fascinating “Emerging Threats” report from the Knight First Amendment Institute is entitled “Is the First Amendment Obsolete?” The First Amendment of the US Constitution reads:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

The author, Tim Wu of Columbia Law School, argues that the First Amendment may be obsolete because “it is no longer speech itself that is scarce, but the attention of listeners.” This realization has led nations, movements, and individuals who want to censor speech to change their practices:

Instead of targeting speakers directly, [the approach] targets listeners or undermines speakers indirectly.

Wu describes three factors that have made flooding-as-censorship possible:

  • The massive decrease since the 1990s in the costs of being an online speaker.
  • The rise of a business model whose currency is the resale of human attention.
  • The rise of the filter bubble, or the tendency of attention merchants to maximize revenues by offering audiences packages of information designed to match their preexisting interests.

The weaponization of advertising platforms was the subject of a recent interview on the Digiday podcast with BuzzFeed’s Craig Silverman, who said:

The infrastructure to target and slice audiences has . . . been weaponized. It’s been a boon for criminals who are stealing billions of dollars a year, for state-sponsored actors who are trying to identify and target audiences with specific messages and then evaluate the effectiveness and adapt to those.

When it comes to using tools weaponized to confuse, demoralize, subvert, divide, and paralyze the attention of audiences, the Chinese and Russian governments have led the way, developing methods of reverse censorship that “flood the zone” using bots and troll farms to target filter bubbles with distractions, alternative theories, and contradictions.

In China, this has allowed the Communist leadership to avoid outright censorship in many cases (although it still occurs). Rather than banning materials, the government pays up to two million people to post on social media in a manner that will “distract the public and change the subject.” As the report says:

In an attention-scarce world, these kinds of methods are more effective than they might have been in previous decades. . . . In such an environment, flooding can be just as effective as more traditional forms of censorship.

Signs of flooding the zone are everywhere, from a billion fake Facebook accounts to millions of fake Twitter accounts.

Filter bubbles also weaken social cohesion. As Wu writes:

. . . speakers face ever greater challenges in reaching an audience of any meaningful size or political relevance.

The current US President uses flooding masterfully, diverting people in a whipsaw fashion morning, noon, and night with tweets highlighted across cable, network, and online news sites. As Wu writes:

. . . the current American President has seemingly directed online mobs to go after his critics and opponents, particularly members of the press. Even members of the President’s party have reportedly been nervous to speak their minds, not based on threats of ordinary political reactions but for fear of attack by online mobs.

This echoes concerns I’ve heard from both publishers and librarians about voicing opinions on OA and other policy issues. They worry the zone will be flooded with objections and that they will have to spend all day fighting with trolls. This fear, like that of the GOP members mentioned above, leads to self-censorship. Part of the reason I left the Scholarly Kitchen was that the comment threads had become a space flooded with distraction, objection, and trolling. They remain so today: a recent post on OA drew a remarkable 107 comments, most of them in long threads intended to wear out the author.

Politicians worldwide have learned that flooding the zone is a perfect way to manipulate the information space favorably while appearing to be open and accessible. It’s a paradox until you realize they’re manipulating attention.

We are also flooding our own zone, using these tools indiscriminately. As the Google protesters described it:

I think we’re definitely encouraged by the powers that be to funnel our anger and our energy into places that it will not grow into anything actually powerful. We have to figure it out on our own with each other, how to actually build power and hold the powerful accountable.

Maybe social media has been a gift to the powerful and entrenched all along.

I’ve written recently about why corporations would love full OA. I think entities wishing to flood the zone would love full OA as well. After all, what better way to confuse people about issues like climate science, nutrition, economics, or politics than to have authoritative-appearing misinformation flooding the zone along with bots and trolls, all targeting filter bubbles people barely realize they’re in?

We need to modernize our thinking. Even more urgently, we need to begin to think ahead. The OA movement remains firmly backward-looking, citing declarations made 15 or more years ago as if they were scripture and embracing a mentality that emerged when Silicon Valley was a lovely and promising vision, not a dystopian reality. The world has changed dramatically. We no longer have too little information. Opening information up has led to predatory publishers, fake news, bots, troll farms, and worse. Our attention is being attacked and subverted in a new form of censorship, one that uses information abundance against us.

The responsible thing to do is to build up systems, expertise, and venues that are not subject to these new information manipulation techniques. What do those look like? Trusted brands as signals of quality. Trusted writers and editors with the resources to do the hard work. Trusted, professional journalists incentivized to seek the truth. People and outlets deserving of our attention.

The business model is key to this. With subscriptions rebounding, there is hope. The subscription model aligns readers and producers, gives recipients power, and allows the pursuit of quality over quantity. There is a lot at stake. As Jaron Lanier wrote nearly a decade ago:

Would the recent years of American history have been any different, any less disastrous, if the economic model of the newspaper had not been under assault? We had more bloggers, sure, but also fewer Woodwards and Bernsteins during a period in which ruinous economic and military decisions were made. . . . Instead of facing up to a tough press, the administration was made vaguely aware of mobs of noisily opposed bloggers nullifying one another. . . . The effect of the blogosphere overall was a wash, as is always the case for the type of flat open systems celebrated these days.

If we’re going to make progress, we need to protect the information zone so it isn’t a flood zone. That may mean breaking up dominant companies like Facebook and Google. It may mean returning to business models that aren’t invitations to weaponize information and attack attention. It may mean paying more often for information we can trust, rather than believing free information is inherently good.

No matter the solutions, clinging to the vestiges of policies developed on the notion of an idyllic, open information landscape is no longer tenable.