Hullabaloo


Monday, July 12, 2010

 
How Do We Think?

by digby

This article in the Globe over the weekend is fascinating, and you should read the whole thing. It contains information we've talked about before, but synthesizes it in a new way. I've excerpted some of the passages I thought were most interesting for you to skim below.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

[...]

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

[...]

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

Here's where the difference between conservatives and liberals comes in. It's not a huge difference, but it is significant:

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

The political scientists have some theories about how to fix this, but they seem like pretty thin gruel to me. Apparently people who feel secure are more open to new information, so if we could make everyone feel good about themselves, this could be dealt with. (I'm guessing that if that could be done on a mass scale, we might not even need democracy anymore.)

And the answer doesn't seem to be more education. It turns out that even those who are factually right about 90 percent of things are so confident that they are completely unwilling to correct the 10 percent they are wrong about.

One of the suggested cures is to shame the purveyors of information, the media, with fact-checking. But the author rightly observes that the media is shameless.

It turns out that our brains are designed to create "cognitive shortcuts" to cope with the rush of information, which I'm guessing is more important than ever in this new age. I'm also guessing that one of these "cognitive shortcuts" is trusting in certain tribal identifications and a shared "worldview" to make things easier to sort out, which is why things are getting hyperpartisan and polarized in this time of information overload. (And sadly, one effect of that would be more confirmation of whatever bad information exists within the group.) So politics becomes a dogfight in which the battle is not just between ideas, but between the facts themselves.

We've seen the beginnings of a sophisticated manipulation of this effect during the Bush administration's experimentation with epistemic relativism. (We are seeing it today with the obsession with deficits as well.) I wonder if democracy is up to this.

.