Why, in an age in which the world’s information is easily accessible at our fingertips, is there still such widespread disagreement among people about basic facts? Why is it so hard to change people’s minds about what is true, even in the face of overwhelming evidence?
Perhaps some of these inaccurate beliefs are the result of an increase in the intentional spreading of false information, a problem exacerbated by the efficiency of the Internet. But false information has been spread for about as long as we have had the ability to spread information at all. More importantly, the same technologies that allow false information to spread efficiently also allow us to fact-check our information more efficiently. For most questions, we can find a reliable, authoritative answer more easily than anyone has ever been able to in all of human history. In short, we have more access to truth than ever. So why do false beliefs persist?
Social psychologists have offered a compelling answer to this question: the failure of people to alter their beliefs in response to evidence is the result of a deep problem with our psychology. In a nutshell, psychologists have shown that the way we process information that conflicts with our existing beliefs is fundamentally different from the way we process information that is consistent with those beliefs, a phenomenon labeled “motivated reasoning.” Specifically, when we are exposed to information that meshes well with what we already believe (or with what we want to believe), we are quick to accept it as factual and true. We readily categorize it as another piece of confirmatory evidence and move along. On the other hand, when we are exposed to information that contradicts a cherished belief, we tend to pay more attention, scrutinize the source, and process the information carefully and deeply. Unsurprisingly, this allows us to find flaws in the information, dismiss it, and maintain our (potentially erroneous) beliefs. The psychologist Tom Gilovich captures this process elegantly, describing our minds as being guided by one of two questions, depending on whether the information is consistent or inconsistent with our beliefs: “Can I believe this?” when it is, or “Must I believe this?” when it is not.
This goes not just for political beliefs, but for beliefs about science, health, superstitions, sports, celebrities, and anything else you might be inclined (or disinclined) to believe. And there is plenty of evidence that this bias is fairly universal—it is not just a quirk of highly political individuals on the right or left, a symptom of the very opinionated, or a flaw of narcissistic personalities. In fact, I can easily spot the bias in myself with minimal reflection—when presented with medical evidence on the health benefits of caffeine, for instance, I eagerly congratulate myself on my coffee-drinking habits. When shown a study concluding that caffeine has negative health effects, I scrutinize the methods (“participants weren’t randomly assigned to condition!”), the sample size (“40 college-aged males? Please!”), the journal (“who’s even heard of this publication?”), and anything else I can.
A bit more reflection on this bias, however, leaves me distressed. It is very possible that, because of motivated reasoning, I have acquired beliefs that are distorted, biased, or just plain false. I could have acquired these beliefs all while maintaining a sincere desire to find out the real truth of the matter, exposing myself to the best information I could find on a topic, and making a real effort to think critically and rationally about that information. Another person with a different set of pre-existing beliefs could follow all of these same steps, with the same sincere desire to know the truth, and come to the opposite conclusion. In short, even when we reason about things carefully, we may be deploying this reasoning selectively without ever realizing it. Hopefully, just knowing about motivated reasoning can help us defeat it. But I do not know of any evidence indicating that it will.