Most people choose their beliefs; they don't reason their way to them. Many won't reject even cherished beliefs just because they've learned something new.
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

Social norms matter immensely in the choice of what to believe. The norms of society as a whole are important, though they may not be as strong for a person as the norms of a particular group that is central to that person's identity. My sixth-grade teacher told me that evolution wasn't nice; she wasn't even willing to engage on the facts.
This doesn't mean that the truth is socially constructed. It means that social groups construct belief systems, often irrespective of observation and indifferent to the truth. Just because these systems are popular (religious fundamentalism, for instance) doesn't make "true for the group" a valid heuristic. It merely makes them well-protected.
The studies reported in the Globe found that conservatives, true to form, are often ignoramuses. Many don't care what you tell them: offering a fact at variance with their beliefs immediately marks you as biased. They rely on conservative media to package carefully constructed bullshit that protects their notions from contradiction. It's Fox's job to shelter its viewers from the well-known liberal bias of reality.
The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire.

Liberals, as always, are better, but we don't have that much to be proud of either.
The effect was slightly different for self-identified liberals: when they read corrected stories about stem cells, the corrections didn't backfire, but the readers still ignored the inconvenient fact that the Bush administration's restrictions weren't total.

The Boston Globe story chalks our irrationality up to the wiring of our brains. That's a pretty lame shrug of the shoulders. That wiring hasn't changed, but the degree to which we are polarized has. Yes, it's normal for all people to resist changing their beliefs and to be skeptical of new facts. But only in the past few decades has it become normal for conservatives in particular to refuse to accept new information.
There has been an asymmetry in the past. Conservatives, by definition, resist change, so it's not surprising to find more ignoramuses among them. Liberals, for their part, are more likely to support change for its own sake, which is not a positive trait either.
But the asymmetry in teachability is larger now than at any previous time in my life. I think the crucial difference between now and the 1970s, for example, is that there is a diverse and well-organized propaganda operation dedicated to keeping conservatives away from difficult and dissonant facts.
The new social norm, created by talk radio, conservative think tanks, conservative foundations, Christian fundamentalists, and others, values adherence to the conservative creed over finding out what's actually true about the world.
To say that this is a danger to democracy, as the article does, is a vast understatement. But then, the radicals who currently call themselves conservatives don't actually value democracy. Instead, they believe that any liberal-ish government is illegitimate and should be brought down as soon as possible. That's even more dangerous.
(h/t Philosoraptor)
1 comment:
Do you know if they measured this effect over time? Some people take time to mull over things and investigate more thoroughly before finally changing their minds. Some call this tendency "faith".
Also, I wonder how objective these facts were. Some people are poor at sifting out patterns that really are there in the noise, while others see patterns that aren't there.
Also, did these people have any previous knowledge that contradicted the facts they were given? If so, why would anyone trust what they were being fed by the researchers?