The truth hurts: People who can't use facts to win an argument use untestable morals
Have you ever argued with someone who seemed to have a comeback for everything you said? Your fact-based arguments didn’t convince them. Neither did pointing out that what you were saying is common sense. Instead, they seemed to grasp for whatever argument would stand against yours. This happens because people will go to great lengths to defend their worldviews, regardless of whether those views are true. And a new study now finds that when facts don’t support these views, people will retreat into territory we can’t prove wrong, Medical Daily reports.
“Our new research… examined a slippery way by which people get away from facts that contradict their beliefs,” Troy Campbell and Justin Friesen, authors of the new paper, wrote for Scientific American. “Of course, sometimes people just dispute the validity of specific facts. But we find that people sometimes go one step further and… they reframe an issue in untestable ways. This makes potentially important facts and science ultimately irrelevant to the issue.”
This happens because people need to feel that their worldviews matter, regardless of what the truth is. The researchers argue that these behaviors are the basis of political polarization. After all, previous research has found that when our beliefs are challenged, we stick to them even more. This unwillingness to hear out another view further entrenches us in our side of the argument, and the more steadfast we are, the more likely we are to think the opposing view is inferior.
For their study, the researchers conducted four experiments. The first two tested whether challenging a person’s beliefs pushed them to argue in untestable ways. One, for example, involved 174 participants who either supported or opposed same-sex marriage; they were shown purported facts that either bolstered or refuted their opinions. When participants saw facts that opposed their views, regardless of their stance, they were more likely to say the issue wasn’t about facts but about their own moral opinions. Conversely, when the statements supported their beliefs, they were more likely to say their opinions were based on fact rather than morals. “In other words, we observed something beyond the denial of particular facts: We observed a denial of the relevance of facts,” the researchers wrote.
For the other two experiments, Campbell and Friesen looked at how people viewed their beliefs when those beliefs were framed as testable rather than untestable. In one experiment, some participants were told President Obama’s policy performance could be empirically tested, while others were told nothing; all were then asked to rate his performance in five areas, such as job creation. When the researchers compared supporters’ and opponents’ responses, they found that, among those who were told his performance was testable, the number of people whose ratings fell at either extreme dropped by 40 percent.
The fourth experiment showed how people bolster their beliefs especially when they lack a factual basis to rely on. When 103 religious participants were told that God’s existence could never be tested, those who were highly religious were more likely to express stronger religious beliefs, such as the belief that God was looking out for them. Highly religious participants who were instead told God’s existence would one day be proven or disproven were somewhat less likely to report such sentiments.
Overall, these findings show how strong bias can be in the face of facts, and suggest how political and ideological polarization takes hold. “We’ve learned that bias is a disease, and to fight it we need a healthy treatment of facts and education,” the researchers wrote. “We find that when facts are injected into the conversation, the symptoms of bias become less severe. But unfortunately, we’ve also learned the facts can only do so much.”