Good morning.
Monday in Whitewater will be sunny with a high of 76. Sunrise is 5:32 AM and sunset 8:29 PM for 14h 57m 24s of daytime. The moon is new with none of its visible disk illuminated.
Whitewater’s Library Board meets at 6:30 PM.
On this day in 1955, Walt Disney dedicates and opens Disneyland in Anaheim, California.
Joshua Benton writes “When it comes to misinformation, partisanship overpowers fact-checking, over and over again” (‘Why do people fail to update their beliefs in light of clear evidence to the contrary? Our research provides an answer: partisanship is a powerful factor that can lead people away from accuracy’):
Yes, a lot of misinformation spread on Facebook; yes, a lot of people got a lot of their political news from dubious sources. But we know now that people’s brains don’t have an on/off switch that gets flipped by a well-made fact-check. People’s beliefs are driven by a huge number of psychological and social factors, far beyond whether they follow PolitiFact on Instagram. Knowledge alone doesn’t knock out beliefs held for deeper reasons — and sometimes, it entrenches them more deeply.
You can see those extra layers of nuance in a lot of the academic research in the field. Like in this paper, which came out in preprint recently. It argues exactly what it says on the tin: “Partisans Are More Likely to Entrench Their Beliefs in Misinformation When Political Outgroup Members Fact-Check Claims.” Its authors are Diego A. Reinero (Princeton postdoc), Elizabeth A. Harris (Penn postdoc), Steve Rathje (NYU postdoc), Annie Duke (Penn visiting scholar), and Jay Van Bavel (NYU prof).
Here’s the abstract:
The spread of misinformation has become a global issue with potentially dire consequences. There has been debate over whether misinformation corrections (or “fact-checks”) sometimes “backfire,” causing people to become more entrenched in misinformation.
While recent studies suggest that an overall “backfire effect” is uncommon, we found that fact-checks were more likely to backfire when they came from a political outgroup member across three experiments (N = 1,217).
We found that corrections reduced belief in misinformation; however, the effect of partisan congruence on belief was 5× more powerful than the effect of corrections. Moreover, corrections from political outgroup members were 52% more likely to backfire — leaving people with more entrenched beliefs in misinformation.
In sum, corrections are effective on average, but have small effects compared to partisan identity congruence, and sometimes backfire — especially if they come from a political outgroup member. This suggests that partisan identity may drive irrational belief updating.
In Whitewater (as in places across America), in the years before the pandemic and in the years since, it has proved difficult to move people away from false claims. During the pandemic in Whitewater, it would have been a herculean task to move one side or another from its pandemic position. Mere recitation of statistics was ineffectual (and believing otherwise was obtuse).
These are challenging times requiring a slog.
Monday reflections from Meow the Cat: