Tomorrow (Tuesday November 6th) is election day in the US. Pundit talk generally focuses on the fact that this is a very close election and that, despite rhetoric from both candidates about bipartisanship, the country seems to be extremely politically polarized. The consensus is that Democrats and Republicans over the last couple of decades have become more homogeneous, more tribal, and more extreme. (Meanwhile, the number of people who identify as independents has increased.) For me, political campaigns are a massive exercise in confirmation bias – watching both sides spin the same data in completely opposite directions.
There is no shortage of theories as to why this is the case, but there is also the separate question of what can be done to break, or at least moderate, this polarization. In a series of experiments psychologists have found that slowing down the process of evaluating a political question, and engaging people’s abstract thinking, moderates their political views.
In the first series of experiments, Preston and Hernandez found that giving subjects information in a hard-to-read font made their opinions more thoughtful and moderate. They gave two groups a description of a defendant in a capital murder case. One description praised the defendant’s character while the other criticized it. Both groups were then given “sketchy” evidence suggesting the defendant’s guilt. The two groups interpreted this ambiguous evidence differently, with those who had read a positive description of the defendant’s character less often finding the evidence sufficient to convict.
So the groups were essentially biased in one direction or the other, then asked to look at ambiguous evidence, and that bias had a measurable effect on their assessment of the evidence. The same effect was seen when subjects were sorted by pre-existing political ideology rather than by the bias induced by reading a positive or negative character description. However, when the personality and evidence descriptions were given in a difficult-to-read font, this effect was decreased (though not eliminated).
A further finding was that when subjects were under cognitive load (given another simultaneous task), the moderating effect of reading a difficult font disappeared.
The researchers interpret all of this as the action of confirmation bias – a core cognitive bias that motivates people to seek out and notice information that confirms existing beliefs and either ignore or dismiss evidence against their existing beliefs or in favor of a competing belief. Confirmation bias is the default mode of human thinking – the cognitive pathway of least resistance that we will tend to follow. If you force people to slow down and think harder, even in a manner tangential to the question at hand, confirmation bias is moderated by deeper evaluation. However – deeper evaluation takes cognitive energy, and if you deprive subjects of this energy by giving them another task to perform, then the default mode of confirmation bias takes hold.
In a more recent series of experiments (paper is behind a paywall, here is the press release) Preston, Yang, and Hernandez looked at attitudes toward building a mosque near ground zero of the 9/11 attacks on the Twin Towers. They surveyed subjects for their attitudes on building a mosque in this location, and then gave two groups different tasks – the first group was asked a series of “how” questions and the second a series of “why” questions, all on completely unrelated topics. They found that the “why” group, but not the “how” group, moderated their views on building the mosque. The effect was present for both liberals and conservatives. Preston interprets the results:
“We observed that liberals and conservatives became more moderate in their attitudes. After this very brief task that just put them in this abstract mindset, they were more willing to consider the point of view of the opposition.”
This experiment was not about confirmation bias but rather simply taking the time to consider all points of view on an issue. When people were in the mode of abstract thinking, they were more likely to do this, even on an unrelated topic.
These experiments involve short-term psychological manipulation in order to elicit the effects seen. There is no evidence of any long-term moderation from any of the manipulations. They do, however, demonstrate a very interesting principle – that many people are capable of thinking more deeply and objectively about topics, even those that are highly emotional and political. In these studies, external factors were used to increase abstract thinking and reduce confirmation bias in the short term. What if we could internalize these effects for the long term?
Imagine if students were systematically educated to engage abstract thinking and to ward off the effects of confirmation bias (and other biases) when considering important issues (or all issues, for that matter). This, in essence, is scientific skepticism. Skeptics are those who do not simply flow down the path of least resistance, giving in to the lowest energy state of thought, surrendering to cognitive entropy. Skepticism is about understanding the nature of cognitive biases and then doing the hard mental work of thinking complexly and abstractly about important questions.
The trigger for skeptical evaluation needs to be internal. In this way, being a skeptic is partly just a habit of thought. The skeptic stops and asks, “Wait a minute, is this really true?” When confronting an opposing opinion or interpretation of the evidence, the skeptic tries to understand the various points of view and will at least try to assess each point fairly, recognizing that many topics are complex, with good and bad points on all sides.
Being a skeptic is also about applying the findings of decades of psychological research to our everyday lives. It is a shame that psychologists have conducted thousands of experiments carefully describing the many ways in which human thinking is biased, and yet public awareness of this useful body of knowledge is limited.
As a result, election day will likely be little more than a national exercise in confirmation bias.