Is Facebook's Run About to End?

A German decision may mark the beginning of the end for the surveillance economy


Facebook is gathering more data about us all the time, as summarized in a recent BusinessWeek article:

[Facebook] tracks people across the internet on other companies’ websites and apps. It uses IP addresses to target ads to people who turned off location-based tracking on their phones. It’s been caught collecting call and text histories from users’ Android devices. It’s stored facial data from people who never agreed to biometric scans. It was just caught monitoring the phone activity of some kids as young as 13. On Feb. 7, Germany’s antitrust regulator was expected to announce the results of a three-year investigation into whether the company has illegally used its market power to coerce data sharing consent.

When Facebook reported strong earnings at the end of last year, it seemed the business juggernaut would not be slowing anytime soon, despite all the hand-wringing about Facebook's culpability in damaging democracies, enabling genocide, and contributing to higher rates of depression and suicide among young people.

But a recent German ruling, described in WIRED under the breathless headline “German Regulators Just Outlawed Facebook’s Whole Ad Business,” may prove to be an inflection point. With 32 million monthly active users, Germany accounts for only a sliver of Facebook’s 2 billion users. But the ruling is the proverbial camel’s nose under the tent flap, with the president of Germany’s Federal Cartel Office stating:

“Facebook will no longer be allowed to force its users to agree to the practically unrestricted collection and assigning of non-Facebook data to their Facebook user accounts.”

External regulation seems the only path to moderating Facebook’s power and consumption of data. In his new book, “Zucked,” Roger McNamee, an early investor in Facebook and a longtime friend of Mark Zuckerberg and Sheryl Sandberg, writes about his disillusionment with Facebook’s leadership:

What I did not know when I met [Zuckerberg] but would eventually discover was that his idealism was unbuffered by realism or empathy. He seems to have assumed that everyone would view and use Facebook the way he did, not imagining how easily the platform could be exploited to cause harm. He did not believe in data privacy and did everything he could to maximize disclosure and sharing. . . . He embraced invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence.

Users seem largely unaware of what Facebook has been doing. Last month, Pew released a survey of Facebook users on their knowledge of, and comfort with, the company’s data-gathering practices, with a special focus on how Facebook uses data to categorize its users.

Of those surveyed, 74% did not know Facebook maintained data on their interests and sorted them into categories advertisers can target. Yet when directed to the part of Facebook where users can see these classifications, nearly everyone (88% of respondents) found that Facebook had generated some categorization information about them. Most judged these categorizations accurate. Still, comfort with the practice was low, with 58% saying they are not comfortable with Facebook compiling this information.

Here’s a direct link to this space in Facebook, to spare you from digging around. Be sure to click on “Advertisers,” where you’ll see a list of:

. . . advertisers . . . running ads using a contact list they or their partner uploaded that includes info about you. This info was collected by the advertiser or their partner. Typically this information is your email address or phone number.

Facebook is a hub of third-party data: it gathers marketing data other businesses hold about you and blends it into its own data stores.
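To make that contact-list mechanism concrete, here is a minimal sketch of how this kind of matching typically works, following the common industry pattern of hashing identifiers before upload. The field names and audience name are hypothetical, not Facebook’s actual API.

```typescript
// Illustrative sketch only: contact-list matching as commonly practiced.
// An advertiser normalizes and hashes customer identifiers before upload,
// and the platform matches those hashes against its own user records.
// Field names and the audience name below are hypothetical.
import { createHash } from "crypto";

// Normalize and hash an email so the raw address is never sent directly.
function hashIdentifier(raw: string): string {
  const normalized = raw.trim().toLowerCase();
  return createHash("sha256").update(normalized).digest("hex");
}

// A contact list the advertiser already holds (e.g., a CRM export).
const contactList = ["Alice@example.com", " bob@example.com "];

// What gets uploaded: hashes of the identifiers, ready for matching.
const uploadPayload = {
  audienceName: "newsletter-subscribers", // hypothetical
  hashedEmails: contactList.map(hashIdentifier),
};

// The platform hashes its own users' emails the same way and looks for
// overlaps, which is how "info about you" ends up attached to your account.
console.log(JSON.stringify(uploadPayload, null, 2));
```

The hashing is there to enable matching, not to protect you in any strong sense: once the hashes line up, the advertiser’s offline record and your Facebook profile are linked.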

As Pew writes in its study of Facebook data and the company’s algorithms:

These categories might also include insights Facebook has gathered from a user’s online behavior outside of the Facebook platform. Millions of companies and organizations around the world have activated the Facebook pixel on their websites. The Facebook pixel records the activity of Facebook users on these websites and passes this data back to Facebook. This information then allows the companies and organizations who have activated the pixel to better target advertising to their website users who also use the Facebook platform. Beyond that, Facebook has a tool allowing advertisers to link offline conversions and purchases to users – that is, track the offline activity of users after they saw or clicked on a Facebook ad – and find audiences similar to people who have converted offline. (Users can opt out of having their information used by this targeting feature.)

Overall, the array of information can cover users’ demographics, social networks and relationships, political leanings, life events, food preferences, hobbies, entertainment interests and the digital devices they use. Advertisers can select from these categories to target groups of users for their messages.

There’s a lot to unpack here. More businesses than you probably realize treat you as a data source, and those data are being centralized via Facebook. Facebook also gathers data from the sites you visit, via the login features and tracking code those sites embed, which you may or may not notice. Consent is not granted; it is assumed. A pixel or web beacon is all it takes to get the ball rolling.
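For readers who have never seen one, here is a minimal sketch of what a pixel or web beacon amounts to on a publisher’s page. The tracker endpoint and parameter names are hypothetical; the mechanism, a tiny image request that quietly reports your visit to a third party, is the point.

```typescript
// Illustrative sketch only: a tracking pixel / web beacon on a publisher page.
// The endpoint and parameter names are hypothetical.
const TRACKER_ENDPOINT = "https://tracker.example.com/collect";

function fireBeacon(eventName: string): void {
  const params = new URLSearchParams({
    event: eventName,
    page: window.location.href,   // the page you are reading
    referrer: document.referrer,  // where you came from
  });

  // Requesting a 1x1 invisible image is enough to send the data.
  // Any cookie the tracker has previously set on your browser rides
  // along with the request, tying this visit to your existing profile.
  const pixel = new Image(1, 1);
  pixel.src = `${TRACKER_ENDPOINT}?${params.toString()}`;
}

// Fired as soon as the page loads; no click or login is required.
fireBeacon("PageView");
```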

Such targeting makes it easier to harass people on Facebook. In a recent survey from the Anti-Defamation League (ADL), more than half of Americans reported being harassed or stalked online, with younger people reporting higher rates.

Of those who experienced online harassment, 20% said it was the result of their gender identity, 15% the result of their race or ethnicity, 11% their sexual orientation, 11% their religion, 9% their occupation, and 8% a disability.

Facebook’s data on multicultural affinity is not very accurate. In the Pew survey:

Of those assigned a multicultural affinity, 60% say they have a “very” or “somewhat” strong affinity for the group they were assigned, compared with 37% who say they do not have a strong affinity or interest. And 57% of those assigned a group say they consider themselves to be a member of that group, while 39% say they are not members of that group.

A 60% accuracy rate is not tremendous. I’d wager that many of us could generate more accurate affinity data just walking down the street and noting various things about passersby. Or while people-watching in the airport. And our little game on the street or in the airport would not automatically be ingested and processed by a machine that is often used to harass and embarrass others.

The culpability of publishers in Facebook’s data gathering was the subject of an earlier post, where I urged publishers to stop using Facebook login. Pixels and beacons should also be dropped, I’d argue, because they are feeding data into a sloppy and abusive data monstrosity.

But people aren’t about to give up Facebook en masse. A recent survey found that people love Facebook so much, they’d have to be paid $1,000 to quit it for a year. Facebook derives about $200 per user in revenues, so there’s a 5x gap between revenues and perceived value.

By these measures, Facebook isn’t the most valuable piece of online infrastructure. In 2017, people said they’d need to be paid about $3,600 per year to stop using online maps, about $8,400 to give up using email, and about $17,500 to live without online search.

As long as we’re hooked to free services that trade in our data, we’ll continue to fuel data acquisition and data trading initiatives that tread on privacy, allow online targeting, and encourage large-scale surveillance.

German regulators seem to be setting the table for a major test of GDPR, public opinion, and the surveillance economy. Does it amount to the beginning of the end of Facebook’s surveillance advertising model? Time will tell. But given everything we know, having practices like these curtailed via regulation may be something we look back on and celebrate.

