FREE WHITEWATER

Daily Bread for 9.29.23: Redeeming Social Media

Good morning.

Friday in Whitewater will be partly cloudy with a high of 71. Sunrise is 6:50 AM and sunset 6:39 PM for 11h 48m 34s of daytime. The moon is full with 100% of its visible disk illuminated.

On this day in 1789, the United States Department of War first establishes a regular army with a strength of several hundred men.


Yair Rosenberg offers suggestions on How to Redeem Social Media (‘The next generation of platforms doesn’t have to make the same mistakes as the previous one’):

The slow collapse of Twitter has inspired a host of would-be successors. Millions of people are trying out new social-media platforms such as Meta’s Threads in a textbook triumph of enthusiasm over experience. I’m sure that creating free content for a social-media platform run by an unaccountable billionaire will turn out differently this time, we tell ourselves, as though we were all born yesterday.

….

Establish rules of the road.

For years, Facebook permitted Holocaust denial—until it didn’t. For months, Twitter throttled an array of claims about COVID-19—some false, some merely controversial—until it didn’t. Both platforms banned Donald Trump, then reversed course. Meanwhile, Chinese officials who insinuate that COVID-19 began in the United States and work to obfuscate their regime’s horrific repression of Uyghur Muslims mostly go unpunished. The problem is not that social-media companies such as Twitter or Facebook moderate their content. It’s that their process is opaque and seemingly capricious, and the precise basis for decisions is rarely disclosed to the public. Rather than cultivate a healthy online community, this sort of arbitrary administration breeds distrust.

What’s needed instead is a transparent set of detailed criteria governing suspensions, bans, and other punishments that is clearly explained, regularly updated, and consistently applied. Social-media platforms have typically kept the specifics of these determinations private, because they want to avoid opening themselves up to controversy. But making moderation a black box has invited a different form of controversy. Because users could never tell exactly why certain content was taken down or suppressed, the platforms became easy targets for suspicion, paranoia, and accusations of bias. It’s true that no set of public-moderation principles will satisfy all comers. But that’s a feature, not a bug. Users will be able to choose where to spend their time based on whether a platform aligns with their ideals, and platforms will no longer be plagued by users who are constantly aggrieved by their treatment.

(Emphasis added.)

Clarity of rule-making and enforcement benefits platforms and users. 


Man Skydives Straight Onto an Inflatable Unicorn:
