Many people think that the function of human reason is to help individuals make better decisions and reach more accurate beliefs, largely on their own. Reason would help us overcome the limits of our intuitions — cognitive mechanisms that function quickly and effortlessly, but that are prone to systematic mistakes.
But as social and cognitive scientist Dan Sperber and I argue in our upcoming book, The Enigma of Reason, a mountain of empirical results shows that reason performs this function very poorly. Indeed, in quite a few cases it has the opposite effect, driving people toward worse choices.
When people face a decision — like figuring out whom to vote for or trying to solve a problem in the workplace — they overwhelmingly find arguments that support their original hunch, whether it’s right or not.
Instead of objectively assessing the different options, evaluating their pros and cons, and making sure we have good reasons for our decisions, our reasoning mechanisms focus on arguments that support our initial views, and are content with relatively shallow arguments.
This makes it particularly difficult to change our minds on our own. Far from correcting our intuitive mistakes, reason might even amplify them. As people find ever more arguments supporting their initial positions, they are wont to develop stronger attitudes and to become overconfident.
Psychologists call this tendency to find reasons supporting our preexisting beliefs the confirmation, or myside, bias. At first glance, this bias seems an unmitigated disaster. But then, why would we have it? As a rule, when our minds are biased, it’s because the bias has some utility.
Sperber and I suggest that the myside bias serves social functions. To understand why this is the case, we must rethink the function of reason.
We claim that reason didn’t evolve because it serves solitary reasoners, but because it helps us interact with others.
Thanks to reason, we can justify our creeds and our behaviors. We can explain to people why something that looks stupid in fact makes sense. And others can evaluate these justifications — if they are good enough, they might reconsider their initial negative judgments.
Thanks to reason, we can also argue for our beliefs. We can provide people with reasons to change their minds and adopt the ideas or course of action we think most appropriate. Again, others can evaluate these arguments and, if they are good enough, change their minds to adopt our stance.
From this perspective, it makes sense that reason should have a myside bias. If you want to justify your actions, or convince someone to share your views, you can only do so by finding reasons that support your actions or views, not reasons that challenge their validity.
However, for reason to serve these social functions, it must also be able to evaluate others’ justifications relatively objectively. If we never changed our minds in reaction to a sensible justification or an apposite argument, the exchange of reasons would be pointless.
If we are biased when we produce reasons, but relatively objective when we evaluate others’ reasons, then dialogues should be fruitful. After exchanging reasons, people should generally end up with a better understanding of each other, and with better ideas more generally. This prediction is supported by a very wide range of empirical studies, from studies of cooperative learning in schools to research on economic predictions.
So what can you do to improve your reasoning at work?
Reasoning works best in interactive settings, in the back and forth of a conversation. This means that reasoning should work best in relatively small groups of four to five people, when it is still possible to have a conversation.
However, there are important trade-offs to consider. Small groups are less diverse, and there’s always a danger that all their members agree with each other. In this case, the conversation is much less likely to be productive. Indeed, it might even be detrimental, as people share reasons that bolster their shared point of view — a phenomenon known as groupthink, or group polarization.
So try to make sure that not everybody agrees, and let people who disagree share their opinions freely, without fear of repercussions. In the absence of spontaneous disagreement, a group member can play the role of devil’s advocate.
If you’re unable to work with a group, try to make yourself consider other options and anticipate what others might think. For example, if you’re creating a presentation, try to guess what your colleagues will think and incorporate that into your work.
Imagining yourself having to justify a decision to others — especially others whose opinions and abilities you respect — can help motivate you to find better reasons. But it’s never going to be as good as someone else’s actual feedback.
Hugo Mercier is a cognitive scientist working at the Institut des Sciences Cognitives Marc Jeannerod in Lyon, France, and the co-author of The Enigma of Reason.