It has many names — sowing confusion, flooding the zone, muddying the waters — but the effect is the same. Whatever you call it, some actions, choices, behaviors, systems, and strategies introduce noise into the communications system. Other approaches introduce controls to minimize or eliminate noise, striving for pure signal.
The signal-to-noise ratio is a bedrock concept of communication technology, and it extends to communication theory. When the noise in a communication system begins to distract from or overwhelm the signal, and when that is done intentionally, it can amount to a novel form of censorship, as I wrote back in 2018:
Instead of the traditional censorship of shutting down the “sender” side of the communication paradigm, flooding works by confounding the “receiver” side. Its practitioners can claim to be allowing or even promoting communication and freedom, while effectively disallowing the kind of coherent conversations or consolidated understanding that might lead to actual empowerment and change.
This was from an essay entitled "Is Scholarly Publishing Self-Flooding?"
Since then, it’s become clear to me that we’re on a path to merely scrapbooking science unless we deviate from the practice of posting and distributing everything scientists care to put into some media format.
The issue of “noise” is going to be a theme for a while, and that’s good, because it bears examination. The impetus is Daniel Kahneman, of “Thinking, Fast and Slow” fame, who has a new book out entitled “Noise: A Flaw in Human Judgment,” co-written with Olivier Sibony and Cass R. Sunstein.
While intentionally injecting noise into the communications environment has become the strategy of demagogues and propagandists, noise can also be introduced inadvertently, or by poorly designed systems. Noise isn’t the same as bias: bias is a directional slant, whether intentional or not, while noise is simply variability or unreliability in a system where you want as little of either as possible.
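The distinction is easy to see in a quick simulation (a hypothetical sketch of my own, not an example from Kahneman's book): a biased judge is consistently wrong in the same direction, while a noisy judge is right on average but scattered all over the place.

```python
import random
import statistics

# Hypothetical setup: many judges estimate a quantity whose true value is 100.
random.seed(0)
TRUE_VALUE = 100.0

# A biased judge: consistently overshoots by about 5, with little scatter.
biased = [TRUE_VALUE + 5 + random.gauss(0, 1) for _ in range(1000)]

# A noisy judge: correct on average, but wildly inconsistent.
noisy = [TRUE_VALUE + random.gauss(0, 10) for _ in range(1000)]

print(statistics.mean(biased) - TRUE_VALUE)  # directional error: bias
print(statistics.stdev(biased))              # scatter: noise
print(statistics.mean(noisy) - TRUE_VALUE)   # near zero: no slant
print(statistics.stdev(noisy))               # large: lots of noise
```

The biased judgments average about 5 units high but cluster tightly; the noisy judgments center on the truth but spread roughly ten times as widely. Both are errors, and a system can suffer from either or both at once.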
In a recent interview, Kahneman discussed with Kara Swisher what it takes to manage and reduce the noise in a decision-making system — to improve what he calls “decision hygiene.” What he describes might sound awfully familiar:
Kahneman: An example of decision hygiene is when you have different people looking at a problem, you want them to remain independent of each other until the very end. The more independent they are, the more information you get.
Swisher: So you want to have independent judgments and then aggregate them together?
Kahneman: Aggregate them at the end. When you’re looking at a problem, you want to delay intuition. You want to delay a global view of what the issue is until a lot of information has been processed because otherwise, we tend to jump to conclusions.
This sounds a lot like editorially managed peer review, editorial review, and other forms of review (statistical, technical, ethical) that journals combine as part of their role as trusted intermediaries.
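Kahneman’s prescription of independent judgments aggregated only at the end has a simple statistical basis: averaging n independent, unbiased judgments shrinks the noise by roughly the square root of n. A minimal sketch, with illustrative numbers of my own choosing:

```python
import random
import statistics

random.seed(1)
TRUE_VALUE = 50.0
NOISE_SD = 8.0  # hypothetical spread of a single judge's error

def one_judgment():
    """A single judge: unbiased on average, but noisy."""
    return TRUE_VALUE + random.gauss(0, NOISE_SD)

def panel_judgment(n):
    """n judges work independently; their views are aggregated only at the end."""
    return statistics.mean(one_judgment() for _ in range(n))

solo_errors = [abs(one_judgment() - TRUE_VALUE) for _ in range(2000)]
panel_errors = [abs(panel_judgment(9) - TRUE_VALUE) for _ in range(2000)]

# A nine-judge panel should cut the typical error by roughly a factor of three.
print(statistics.mean(solo_errors))
print(statistics.mean(panel_errors))
```

The catch, as Kahneman notes, is independence: if the judges confer before judging, their errors correlate and the averaging gains largely evaporate, which is one argument for keeping reviewers blind to each other’s reports until the end.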
Yet, we keep pumping out unreviewed, unvalidated, untested papers. We keep them available for discovery in perpetuity. We layer on weak reviews and confusing widgets. We generate more noise via redundancy, versioning, overlaying, and preprinting.
We are not a source of peace and quiet, but have become a source of additional noise. (And we seem to have lost track that we’re here to support slow thinking, not fast.)
What does this mean for users, readers, and researchers? Most are withdrawing to quieter, more trusted sources. The PDF’s continued popularity may be largely because it’s quiet.
I’ll be reading Kahneman’s book with interest. It’s another influence on how we proceed from here with our own forms of decision and information hygiene. It looks like another step away from techno-utopianism.