Political Dissonance
Joe Keohane has a fascinating summary of our political biases in the Boston Globe Ideas section this weekend. It's probably not surprising that voters aren't rational agents, but it's always a little depressing to realize just how irrational we are. (And it's worth pointing out that this irrationality applies to both sides of the political spectrum.) We cling to mistaken beliefs and ignore salient facts. We cherry-pick our information and vote for people based on an inexplicable stew of superficial hunches, stubborn ideologies and cultural trends. From the perspective of the human brain, it's a miracle that democracy works at all. Here's Keohane:
A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare -- the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct -- but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)
Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the "I know I'm right" syndrome, and considers it a "potentially formidable problem" in a democratic system. "It implies not only that most people will resist correcting their factual beliefs," he wrote, "but also that the very people who most need to correct them will be least likely to do so."
In How We Decide, I discuss the mental mechanisms behind these flaws, which are ultimately rooted in cognitive dissonance:
Partisan voters are convinced that they're rational -- only the other side is irrational -- but we're actually rationalizers. The Princeton political scientist Larry Bartels analyzed survey data from the 1990s to prove this point. During the first term of Bill Clinton's presidency, the budget deficit declined by more than 90 percent. However, when Republican voters were asked in 1996 what happened to the deficit under Clinton, more than 55 percent said that it had increased. What's interesting about this data is that so-called "high-information" voters -- these are the Republicans who read the newspaper, watch cable news and can identify their representatives in Congress -- weren't better informed than "low-information" voters. According to Bartels, the reason knowing more about politics doesn't erase partisan bias is that voters tend to assimilate only those facts that confirm what they already believe. If a piece of information doesn't follow Republican talking points -- and Clinton's deficit reduction didn't fit the "tax and spend liberal" stereotype -- then the information is conveniently ignored. "Voters think that they're thinking," Bartels says, "but what they're really doing is inventing facts or ignoring facts so that they can rationalize decisions they've already made." Once we identify with a political party, the world is edited so that it fits with our ideology.
At such moments, rationality actually becomes a liability, since it allows us to justify practically any belief. We use our fancy brain as an information filter, a way to block out disagreeable points of view. Consider this experiment, which was done in the late 1960s by the cognitive psychologists Timothy Brock and Joe Balloun. They played a group of people a tape-recorded message attacking Christianity. Half of the subjects were regular churchgoers while the other half were committed atheists. To make the experiment more interesting, Brock and Balloun added an annoying amount of static -- a crackle of white noise -- to the recording. However, they allowed listeners to reduce the static by pressing a button, so that the message suddenly became easier to understand. Their results were utterly predictable and rather depressing: the non-believers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun demonstrated a similar effect with smokers listening to a speech on the link between smoking and cancer. We silence the cognitive dissonance through self-imposed ignorance.
There is no cure for this ideological irrationality -- it's simply the way we're built. Nevertheless, I think a few simple fixes could dramatically improve our political culture. We should begin by minimizing our exposure to political pundits. The problem with pundits is best illustrated by the classic work of Philip Tetlock, a psychologist at UC Berkeley. (I've written about this before on this blog.) Starting in the early 1980s, Tetlock picked two hundred and eighty-four people who made their living "commenting or offering advice on political and economic trends" and began asking them to make predictions about future events. He had a long list of questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds. By the end of the study, Tetlock had quantified 82,361 different predictions.
After Tetlock tallied up the data, the predictive failures of the pundits became obvious. Although they were paid for their keen insights into world affairs, they tended to perform worse than random chance. Most of Tetlock's questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals.
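To make the "worse than chance" comparison concrete, here's a minimal simulation sketch; the question count and the pundit hit rate below are illustrative assumptions, not figures from Tetlock's study:

```python
import random

# Illustrative sketch only: the number of questions and the pundit hit
# rate are assumed values chosen to mirror the claim in the text.
NUM_QUESTIONS = 1000      # hypothetical set of three-option questions
PUNDIT_HIT_RATE = 0.30    # "less than 33 percent" -- assumed rate

def chimp_score(num_questions):
    """Random guessing among three outcomes (the dart-throwing chimp)."""
    return sum(1 for _ in range(num_questions) if random.randint(1, 3) == 1)

def pundit_score(num_questions, hit_rate):
    """A pundit who answers correctly with some fixed probability."""
    return sum(1 for _ in range(num_questions) if random.random() < hit_rate)

random.seed(0)
chimp = chimp_score(NUM_QUESTIONS)
pundit = pundit_score(NUM_QUESTIONS, PUNDIT_HIT_RATE)
print(f"Chimp (random guessing): {chimp / NUM_QUESTIONS:.1%} correct")
print(f"Pundit (assumed rate):   {pundit / NUM_QUESTIONS:.1%} correct")
# The random guesser hovers around 33 percent, while the assumed pundit
# rate sits below it -- which is the point of the comparison above.
```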
So those talking heads on television are full of shit. Probably not surprising. What's much more troubling, however, is that they've become our model of political discourse. We now associate political interest with partisan blowhards on cable TV, these pundits and consultants and former politicians who trade facile talking points. Instead of engaging with contrary facts, the discourse has become one big study in cognitive dissonance. And this is why the predictions of pundits are so consistently inaccurate. Unless we engage with those uncomfortable data points, those stats that suggest George W. Bush wasn't all bad, or that Obama isn't such a leftist radical, our beliefs will never improve. (It doesn't help, of course, that our news sources are increasingly segregated along ideological lines.) So here's my theorem: The value of a political pundit is directly correlated with his or her willingness to admit past error. And when was the last time you heard Karl Rove admit that he was wrong?
via scienceblogs.com
Once again, Jonah Lehrer nails it.
1 comment:
This post made me think of another article I read recently. Essentially, it had to do with the fact that the more "facts" people are given, the less they change their minds, even if those facts conflict with each other. I wish I could find the website so I could share it, but it was interesting.