"The Digitization of Mistakes"

When mistakes can go viral and context is broken, everyone becomes suspicious

"The Digitization of Mistakes"

Jeff Bezos’ response to blackmail attempts by the parent company of the National Enquirer is a story worth following. It has a bit of everything, and may mean prison for some people. But, as Scott Galloway recently stated, there’s something broader afoot that is affecting a lot of people not in the news.

In a special episode of the Recode “Pivot” podcast, Galloway described a core element of the Bezos/AMI scandal as “the digitization of mistakes” — how teens, young adults, and adults now have to deal with the fallout from moments of bad judgment or emotional vulnerability.

In a prior era, evidence of these moments was hard to gather, and even harder to broadcast. Cameras weren’t everywhere. When there were pictures, they were shared clandestinely, perhaps among a few people — a wayward Polaroid or a snapshot that captured something odd in the background. Taking pictures surreptitiously was difficult. Selfies (including those of your private parts) were difficult to take, and even more difficult to see through to completion — after all, unless you had a darkroom, someone was going to see them as they were developed.

Reproducing an embarrassing photo you glimpsed was also difficult — you needed the negatives, or to take and develop a picture of the picture. Even if you had the negatives, your reach was still limited by a variety of factors — the time and cost of development, the mail, and so forth.

Now, everyone has the negatives, and everybody has access to a worldwide broadcast platform — one that never forgets.

By contrast, what’s going on with the government in the Commonwealth of Virginia shows that old, traditional photography and printing can still derail careers. But from a technology perspective, this scandal seems somehow both overdue and quaint.

The prevalence and permanence of words and pictures captured by smartphones, and broadcast either privately (or so the people involved think) or semi-publicly, have generated new fears and worries that are now being baked in at the generational level. As a result, the “digitization of mistakes” — the capture and broadcast of fairly normal or lightly weird but still embarrassing behavior — has led to skyrocketing rates of anxiety, depression, and suicidal behavior among teens, especially girls and young women.

Part of the untold story Galloway touches upon here is that two-bit versions of the blackmail AMI attempted on Bezos are surely going on all the time in schools and universities, as well as other places where people gather or compete. They occur with far lower monetary or reputational stakes, but with perhaps a greater level of vulnerability felt by those victimized. Bezos has billions, and is safe from any real threat here. He can go on the offensive with little risk. But for a developing personality in a school where she or he already feels insecure, with doubts about parental support, school support, friend loyalty, personal definition, and more — to have a bully threaten to release a topless or bottomless photo, evidence of drinking, a picture of illicit drug use, or just something embarrassing (a bloody nose, a bad hair day, an ugly smile) can be devastating.

This kind of bullying even affects successful, strong adults. Sara Wachter-Boettcher is an accomplished expert in digital privacy, and the author of a fantastic book I reviewed last year, “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.” Wachter-Boettcher was invited to give a talk at a Google office about online privacy. Afterwards, the talk was posted on YouTube. The ensuing harassment she received from sexist trolls was predictable, rapid, and demoralizing. This, along with her experience of the entire span of online garbage, has changed her view of technology — it used to be fun and enjoyable, but now it is miserable and exploitative. As she writes in the recent issue of McSweeney’s entitled, “The End of Trust”:

It’s not that technology broke my trust — at least not at first. But it broke my context: I don’t know where I am. . . . This used to feel freeing: I didn’t have to choose. I could simply exist, floating in a mix-and-match universe of my own design. But left unchecked for so long — by shortsighted tech companies, and by my own petty desires — the lack of context bred something sinister: a place where everyone’s motives are suspect. I don’t know who’s watching me, or when they’re coming for me. But I do know they’re there . . .

The digital world has changed. It is no longer simply fun. The pond has flipped, and what was once a novelty inside normalcy has become a new information establishment where civility, privacy, and personal autonomy are the novelties. The new establishment was built by people and companies out to exploit others, and there are few signs it will change soon.

We need to set publication practices and policies accordingly — which is why it bothers me that we’re still pursuing strategies around access to information hatched 20 years ago. This is why the idea of unfettered access to unvalidated or untested scientific papers via preprint servers seems unwise to me. This is why stripping funding from editorial processes strikes me as an unforced error. This is why underfunding Western academic libraries during an unprecedented rise in papers from China makes no sense. Things have changed, and people who are trained and expected to handle information carefully are more important than ever. “Shortsighted tech companies” aren’t reliable, nor are their business models or attitudes. Motives matter. Editors, professionals, and ethics matter. Information and community integrity matter.

What makes sense now that we’ve seen what we’ve seen? We need to adapt, and one place to start is to move beyond rehashing ideas from 1998. If we don’t hit “pause” and have a serious rethink, we are also culpable in the digitization of mistakes. Only this time, the digitization of our mistakes — the implications of having our ideas removed from context — may be uniquely devastating.

Let’s be careful out there.