How Open Becomes Closed

Algorithms, propaganda, and asymmetrical information systems seem set to ensure it


Scholarly publishing is living in the past, I’m afraid. We’re still arguing about things that are really no longer relevant.

Take OA. Not only is there still no agreement, after 20 years, on what it means, but the question is becoming moot in an information landscape that has commodified nearly everything. In fact, OA is potentially feeding a growing information dystopia.

As Sarah Kendzior recently wrote:

. . . the ability to control the attention of the media/people is much more powerful than outright censorship.

This refers to the Trump administration’s ability to change the conversation so rapidly that we forget consequential information, and live instead in a constant storm of information confetti. This is further enabled by a hyper-targeted set of information services that keep people from having a common conversation.

But the ability to control the attention of people isn’t limited to propagandists and demagogues. Algorithms we barely understand are in operation on Facebook, Twitter, YouTube, and Google, and are potentially shaping (or deforming) attention as much as anything.

These algorithms have created viciously fast feedback loops. There’s now a spooky, minute-by-minute uniformity across news sources. A recent survey of journalists found that more than three-quarters use Twitter to monitor competitors, producing a profound mimicry effect. Stories flip not just here and there to suit the inflammation of the day, but everywhere at once, coordinated by Twitter-watching and “monkey see, monkey do” reflexes.

In aggregate, this becomes a form of closing off information by swamping it. We forget about global warming, the latest mass shooting, the local tax issue. And we may never hear about the latest scientific research. The information environment is worse than overheated. It is dominated by whatever generates the most clicks.

Scholarly discovery tools are falling prey to the same forces. At Charleston, I heard about the aptly, if apparently un-ironically, named “Rabbit Hole” discovery service, which is nascent and doesn’t yet work well. Nevertheless, the underlying idea is common: leverage Altmetric to generate a list of articles for users.

Yet Altmetric is driven mostly by Twitter mentions, which amplify content not because tweets are organically popular but because they are algorithmically valuable to Twitter’s advertising business.

Twitter only became profitable once it began to manipulate its feed with algorithms, a factor that was missed in a prominent study and whose significance for Twitter’s reliability remains underappreciated.

Plan S seeks to make human curation of the literature a much less viable business, the result being a literature that is highly commodified and therefore available for any service to use as it sees fit (with attribution).

With Google, Twitter, YouTube, and Facebook controlling our attention while being leveraged by propagandists and demagogues to bury the truth, spread lies, and mislead the public, what leverage will scholarly publishers have in an environment where they have no commercial levers to pull to promote facts, establish important connections, and bring new findings to the large and niche audiences of specialist knowledge workers who need them?

Just a question that crossed my mind today.