The old television police drama Dragnet included the catch phrase "Just the facts, ma'am," often used by the detective Joe Friday when questioning (rather misogynistically) women who were giving him too much irrelevant information.
It seems quite difficult these days to get “just the facts” or even figure out what statements are facts.
This past week I read an article in The New Yorker about why facts don’t change our minds. That seems to be particularly relevant in this time of claims about “fake news” and “alternative facts.” The article is about a number of studies done by researchers that show that our minds have limitations when it comes to reasoning about facts.
In one study, participants were given 25 pairs of suicide notes and told that in each pair one note was real and one was fake; they were asked to distinguish the real from the fake. Half of the notes were indeed real, but the experiment was actually designed to examine how randomly telling some participants that they had been very accurate, and others that they had been very poor at telling the difference, would affect them.
In the second part of the study, all participants were told that they had been deceived and that the real point of the experiment was to gauge their responses to thinking they were right or wrong. They were then asked to estimate how many suicide notes they had actually categorized correctly.
Those who had been told they had scored high on the first part thought they did significantly better than the average person. This happened even though they knew they had no reason to believe those first results meant anything. “Once formed, impressions are remarkably perseverant,” said the researchers.
This kind of experiment has been done many times with the same results.
You might have read or heard about the term “confirmation bias.” This is the tendency of people to embrace information that supports their beliefs and reject information that contradicts them. If you tend to always watch the news channel that gives you the version of the news you already believe, you are a good example of confirmation bias.
Another experiment described in the article used two groups who had been selected because they held opposing opinions about capital punishment. They were given two studies to read: one presented data supporting capital punishment as an effective deterrent; the other presented data refuting the deterrence argument. Both studies were fictional.
The group that initially supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing. Those in the other group did the reverse. No surprise?
Did their views change at all at the end of the study? No, in fact, perhaps more surprisingly, the pro-capital punishment people were now even more in favor of it. Those who had opposed it were more opposed.
Based on that study, an MSNBC viewer who watched FOX News for a day would not come away with a more moderate view or move toward consensus. They would be even more convinced that MSNBC was telling the truth. Confirmation bias leads us to dismiss evidence that goes against our beliefs, and facts don't change our minds.
This is not a good thing.