Daily Bread for 10.30.21: Facebook Revelations Show What a Dog-Crap Company It Is

Good morning.

Saturday in Whitewater will be partly sunny with a high of 58. Sunrise is 7:27 AM and sunset 5:48 PM for 10h 20m 58s of daytime. The moon is a waning crescent with 32.8% of its visible disk illuminated.

On this day in 1938, Orson Welles broadcasts a radio adaptation of H. G. Wells’s The War of the Worlds, convincing some listeners that Martians were invading the planet.

Joshua Benton writes, in “In the ocean’s worth of new Facebook revelations out today [10.25.21], here are some of the most important drops”:

It is, a Nieman Lab investigation can also confirm, a lot to take in. Protocol is doing its best to keep track of all the new stories that came off embargo today (though some began to dribble out Friday). At this typing, their list is up to 40 consortium pieces, including work from AP, Bloomberg, CNBC, CNN, NBC News, Politico, Reuters, The Atlantic, the FT, The New York Times, The Verge, The Wall Street Journal, The Washington Post, and Wired. (For those keeping score at home, Politico leads with six stories, followed by Bloomberg with five and AP and CNN with four each.) And that doesn’t even count reporters tweeting things out directly from the leak. I read through ~all of them and here are some of the high(low?)lights — all emphases mine.

Facebook’s role in the January 6 Capitol riot was bigger than it’d like you to believe.

From The Washington Post:

Relief flowed through Facebook in the days after the 2020 presidential election. The company had cracked down on misinformation, foreign interference and hate speech — and employees believed they had largely succeeded in limiting problems that, four years earlier, had brought on perhaps the most serious crisis in Facebook’s scandal-plagued history.

“It was like we could take a victory lap,” said a former employee, one of many who spoke for this story on the condition of anonymity to describe sensitive matters. “There was a lot of the feeling of high-fiving in the office.”

Many who had worked on the election, exhausted from months of unrelenting toil, took leaves of absence or moved on to other jobs. Facebook rolled back many of the dozens of election-season measures that it had used to suppress hateful, deceptive content. A ban the company had imposed on the original Stop the Steal group stopped short of addressing dozens of look-alikes that popped up in what an internal Facebook after-action report called “coordinated” and “meteoric” growth. Meanwhile, the company’s Civic Integrity team was largely disbanded by a management that had grown weary of the team’s criticisms of the company, according to former employees.


If you think Facebook does a bad job moderating content here, it’s worse almost everywhere else.

This was a major theme in stories across outlets. The New York Times:

On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.

For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.

The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.

“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the Facebook researcher wrote.

“The test user’s News Feed has become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”


The kids fled Facebook long ago, but now they’re fleeing Instagram too.

Also: “Most [young adults] perceive Facebook as [a] place for people in their 40s or 50s…perceive content as boring, misleading, and negative…perceive Facebook as less relevant and spending time on it as unproductive…have a wide range of negative associations with Facebook including privacy concerns, impact to their wellbeing, along with low awareness of relevant services.” Otherwise, they love it.


Apple was close to banning Facebook and Instagram from the App Store because of how they were being used for human trafficking.

From CNN:

Facebook has for years struggled to crack down on content related to what it calls domestic servitude: “a form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception,” according to internal Facebook documents reviewed by CNN.

The company has known about human traffickers using its platforms in this way since at least 2018, the documents show. It got so bad that in 2019, Apple threatened to pull Facebook and Instagram’s access to the App Store, a platform the social media giant relies on to reach hundreds of millions of users each year. Internally, Facebook employees rushed to take down problematic content and make emergency policy changes to avoid what they described as a “potentially severe” consequence for the business.

But while Facebook managed to assuage Apple’s concerns at the time and avoid removal from the app store, issues persist. The stakes are significant: Facebook documents describe women trafficked in this way being subjected to physical and sexual abuse, being deprived of food and pay, and having their travel documents confiscated so they can’t escape. Earlier this year, an internal Facebook report noted that “gaps still exist in our detection of on-platform entities engaged in domestic servitude” and detailed how the company’s platforms are used to recruit, buy and sell what Facebook’s documents call “domestic servants.”

Last week, using search terms listed in Facebook’s internal research on the subject, CNN located active Instagram accounts purporting to offer domestic workers for sale, similar to accounts that Facebook researchers had flagged and removed. Facebook removed the accounts and posts after CNN asked about them, and spokesperson Andy Stone confirmed that they violated its policies.

War Of The Worlds – Complete 1938 Radio Broadcast:
