If you really need to get someone’s attention, just tell them what they want to hear. That’s the conclusion of a fascinating new study from the University of California, Santa Barbara. Researchers report that people can’t help but embrace and believe favorable information while brushing aside statements they would prefer not to be true.
The notion that we seek out and prefer information that supports how we already see the world isn’t exactly new. Technically referred to as confirmation bias, the idea has existed in psychology for decades. Going further back still, the ancient Greek historian Thucydides described much the same tendency thousands of years ago, writing that people “entrust to careless hope” what they wish to be true.
This much more recent work takes things a step further, finding that a single conversation with someone who holds favorable views is enough to reinforce or change one’s own opinions. When participants heard something they didn’t want to believe, however, it didn’t sway their beliefs at all.
You only hear what you want to hear
If the world and everything in it were dictated entirely by logic, all decisions would be made solely according to the available evidence combined with the decision-maker’s own prior relevant experience. Of course, we all know that logic isn’t always prioritized, and plenty of prior research shows that decision-makers are frequently clouded by “motivated beliefs.”
Study authors speculate that motivated beliefs have played a large role in the rise of online misinformation and the problem of social media echo chambers. They also cite the r/WallStreetBets craze earlier this year that saw the price of GameStop stock (GME) rise considerably. Plenty of information clearly showed that GME’s meteoric price rise was artificial and unsustainable, but thousands of Reddit users chose instead to focus on the words of encouragement found on r/WallStreetBets.
To research this phenomenon, participants were recruited and asked to complete an IQ test. Everyone was then separated into pairs based on performance: subjects who scored above the median were paired together, as were those who scored below it. Each pair was instructed to have a conversation about a proposition all participants wanted to be true: that they had performed well on the IQ assessment.
Interestingly, study subjects who were pessimistic about their performance on the IQ test indeed started to believe they had a much better chance at a high score if they were paired with an optimistic partner. So, even though those individuals were down about their potential test score, as soon as they spoke with someone who told them what they wanted to hear (“I’m sure you did well! You look like a smart person”), their beliefs changed.
Conversely, study participants who believed they did well on the IQ test were not at all likely to start second-guessing their performance when paired with a pessimist. Why? It wasn’t what they wanted to hear.
It’s worth noting that these observations were particularly strong among subjects who scored low on the IQ test, suggesting intelligence may play a role in just how quick people are to latch onto preferred beliefs.
Study authors conclude that “the results suggest that bias amplification occurs because people selectively attribute higher informational value to social signals that reinforce their pre-existing motivation to believe.”
“This experiment supports a lot of popular suspicions about why biased beliefs might be getting worse in the age of the internet,” says study co-author Ryan Oprea. “We now get a lot of information from social media and we don’t know much about the quality of the information we’re getting. As a result, we’re often forced to decide for ourselves how accurate various opinions and sources of information are and how much stock to put in them. Our results suggest that people resolve this quandary by assigning credibility to sources that are telling us what we’d like to hear and this can make biases due to motivated reasoning a lot worse over time.”
There is, however, a simple remedy for this tendency. Halfway through the experiment, researchers told everyone which IQ group they had been placed in (above or below the median). After that, most of the biases subjects had developed during the first half of the study disappeared completely. This last finding is perhaps the most important, because it indicates that providing indisputable, reputable information can greatly reduce the problem of motivated beliefs.
As for applying these findings to everyday life, this study is worth keeping in mind the next time you need to broach a delicate subject. Try to frame the conversation in terms favorable to the other person, and they’ll be more inclined to listen to and absorb what you’re saying.
The full study is published in the Journal of the European Economic Association.