Motivated Reasoning as a Barrier to Rationality

Leibniz's diagrammatic reasoning. (Photo credit: Wikipedia)
Many freethinkers, skeptics, and atheists strive to embrace reason and to follow the evidence where it leads. We face many barriers in doing this, and the most important are probably those that reflect the limitations of our minds. Thus, it probably makes sense that we should try to better understand these limitations and be prepared for how they are likely to lead us astray.

Social scientists have a term that captures phenomena we atheists are quite used to seeing in religious believers: motivated reasoning (also known as motivated believing). Of course, we do it too. And we tend to be much less successful in detecting it in ourselves. Motivated reasoning refers to the fact that emotional and motivational factors tend to influence our decision-making, belief formation, and reasoning processes far more than we typically realize. When we engage in motivated reasoning, we:
  1. believe something because we want to believe it (i.e., it feels good, we like it, etc.),
  2. generate reasonable-sounding justifications for the belief, and
  3. complete this process without realizing that we are doing so.
That is to say, we end up with a set of reasonable-sounding justifications for our belief, all assembled after we had already decided what to believe. And worst of all, we do not recognize that we have done this.

One example of motivated reasoning with which every atheist will be familiar is that of wishful thinking (i.e., I believe X because I really want X to be true). This is a good one because it is probably the simplest and one of the most common we see among religious believers. In fact, some honest and insightful religious believers will, in response to effective questioning, admit that they believe certain parts of their religious dogma because they desperately want various beliefs to be true (e.g., "I will be reunited with my deceased loved ones in heaven"). These beliefs are common, and the motivation for holding them is hardly mysterious.

A somewhat trickier set of examples are those that involve some form of self-deception (i.e., I believe X is true of me because I want X to be true of me and I lack the insight or self-awareness to recognize that X is not true of me). Whereas some people engaging in wishful thinking can eventually recognize that they are doing so, the individual using self-deception will have a much harder time with this. What makes self-deception so fascinating is that it can lead people to perceive themselves in ways that are markedly inconsistent with how they are perceived by others (e.g., the angry feminist who doesn't understand how her behavior is turning people off to feminism).

Motivated reasoning has been used by social scientists to understand how people can deny realities such as climate change, the efficacy of vaccinations, and evolution. Neuroscience shows us that reason and emotion are so closely intertwined that how we feel about ideas tends to color our ability to reason. Compared to our emotional reactions, reasoning is much slower and more deliberate. It requires effort in a way our emotional experience does not. This helps to explain why we so often become emotionally attached to certain ideas before our reasoning has kicked in. When this happens, we find ourselves rationalizing and justifying how we feel about an issue rather than reasoning through the issue.

Our goal becomes one of winning the argument and not one of thinking critically about the issues or following where the evidence leads us. This is where we are most likely to fall prey to all the various cognitive biases that so often get us into trouble (e.g., confirmation bias). When we go down this path, we are pursuing a goal that is different from one of acquiring knowledge or maximizing the accuracy of our beliefs; we are pursuing the goal of self-affirmation. The belief, as irrational as it may be, has become part of our identity and will be angrily defended as such.

In his post on the perils of ideology, Dr. Steven Novella wrote:
Ideology also leads to motivated reasoning, to the marshaling of our impressive cognitive abilities not to find the best answer but to defend the answer that the most primitive and emotional parts of our brain have latched onto. When evidence tends to fit our ideology, we are not motivated to question it. When evidence challenges our ideology, we are very good at finding fault with it.
He also explains how this sort of process often leads people to interpret challenges to their beliefs as threats to their very identity, fueling tribalism.

The closest thing we have to an antidote to motivated reasoning seems to be freethought. As Dr. Novella put it, "I think it is best to consider each question unto itself on its own merits." That is, we need to make an effort to discard our various ideologies and look at the issues one-by-one, evaluating positions based on their merit. We need to learn more effective approaches to reasoning.

By forcing ourselves to slow down and go through an intentional process of evaluating the information as objectively as possible, and by distancing ourselves from emotionally laden ideological labels, we put ourselves in a better position to reason. If we are truly following the evidence, motivated reasoning loses at least some of its grip on us.