Have you ever wondered why some people refuse to change their minds, even when faced with hard facts?
Well, it turns out that “motivated reasoning” is to blame — and we’re all guilty of it. It’s difficult to persuade people with strong beliefs, values, or ideologies.
Even when you present scientific studies, historical evidence, pictures, or videos to back up your argument, they just won’t budge from their position.
It’s human nature. We learn selectively and often look for facts that confirm our worldviews rather than facts that challenge them.
We push threatening information away — and pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data, beliefs, and values.
Our ability to reason is subject to a staggering number of biases.
As social psychologist Jonathan Haidt once wrote, “the reasoning process is more like a lawyer defending a client than a judge or scientist seeking the truth.”
Motivated reasoning is the reason we believe what we want to believe more readily, and with less scrutiny, than that which we don’t want to believe. It’s our tendency to readily accept new information that agrees with our worldview, and to critically analyze and even reject information that doesn’t.
It’s the reason even facts don’t change our minds. “A man with a conviction is a hard man to change,” Leon Festinger, Henry Riecken, and Stanley Schachter wrote in their book When Prophecy Fails.
“Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen?
The individual will frequently emerge, not only unshaken but even more convinced of the truth of his beliefs than ever before,” the authors argued.
When you are influenced by motivated reasoning, your biases and prejudices sway your judgments. You pay more attention to information that agrees with your worldview.
“Motivated reasoning is how people convince themselves or remain convinced of what they want to believe — they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs,” writes Julie Beck, a senior editor at The Atlantic.
People who feel emotional about certain issues tend to apply their intelligence in a one-sided, biased way that serves their own beliefs and preconceptions so that they always get the answer they want to see or hear.
Motivated reasoning is widespread in society. What’s more, being intelligent and informed can often make the problem worse.
“High levels of knowledge make someone more likely to engage in motivated reasoning — perhaps because they have more to draw on when crafting a counterargument,” argues Beck.
In law, advocates for both the prosecution and the defense use motivated reasoning to argue guilt or innocence, respectively. The judge or jury, conversely, uses deductive reasoning to pass judgment.
In politics and religion, people with strong beliefs, values, and ideologies dismiss any information that contradicts their original beliefs — a way of avoiding cognitive dissonance, the acute discomfort of simultaneously holding two thoughts that are in conflict.
People who are keen on reducing the discomfort of dissonance double down on their beliefs in the face of conflicting evidence.
The economist J.K. Galbraith once wrote, “Faced with a choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy with the proof.”
Many people are somewhat impervious to new information, preferring the beliefs in which they are already invested. It’s easier and more comfortable to reason away contradictions than to revise our feelings. This happens because those beliefs are deeply tied to identity and worldview.
How to minimize motivated reasoning
Changing one’s mind about beliefs and perceptions built over time is hard work; people prefer mental shortcuts — in this case, making the evidence fit their ready-made conclusions.
Cognitive biases cannot be completely eliminated; however, they can be minimized — even though the process requires both a lot of data that challenges our beliefs and the motivation to want to change.
An entrenched idea is hard to change but not impossible.
The good news is that minimizing motivated reasoning sharpens your critical thinking and improves your results. Better thinking leads to better decisions, which lead to better outcomes in all areas of your life.
To persuade, or change anyone’s mind, start with an open mind. People who disagree with you have a different narrative than you, and your narrative may be as subjective and flawed as theirs.
Question your own assumptions and beliefs first. To construct a holistic view about anything, aim to understand the big picture. Be particularly critical of sources that support your beliefs.
“If you really want to change someone’s mind on a moral or political matter, you’ll need to see things from that person’s angle as well as your own. And if you do truly see it the other person’s way — deeply and intuitively — you might even find your own mind opening in response,” writes Jonathan Haidt, the author of The Righteous Mind.
Convincing someone to change their mind takes time. In conversations, people have to carefully consider their status and appearance. They want to save face and avoid looking stupid. Don’t expect them to abandon their beliefs on the spot, because doing so also puts their social ties at risk.
Don’t argue to win; argue to learn. As Julia Galef, an expert on rationality, judgment, and strategy, so aptly puts it, “Having good judgment and making good decisions, it turns out, depends largely on which mindset you’re in.”
To improve your judgment, learn how to feel intrigued instead of defensive when you encounter new information that contradicts your beliefs.
The real question you need to consider is: What do you most yearn for — to defend your own beliefs or to see the world as clearly as you possibly can?