Why is it so easy for people to ignore science? We have a pandemic tearing its way through the world, and national leaders from a vast array of countries are saying things like, “There is no need to wear masks” or “This isn’t as bad as the media is making it sound.” Fake news, lying statistics, and corrupt political parties (of course it’s the “other” party that is corrupt) are all blamed for the current situation. It isn’t just the political domain that produces these cries of fabrication. Religious individuals and faith institutions believe they cannot gather at full capacity and sing their favorite hymns (without masks, of course) because they are being targeted by liberal factions of society trying to keep “Jesus” out of the public square. Some would argue that people today have lost their minds, but I will argue quite the opposite in this post. People haven’t lost their minds; they are simply ignoring how their minds work and falling into all the traps that a lack of critical thinking sets in these passionate, hyper-emotional times. We don’t ignore the facts so much as use them to create a narrative we like, one that supports our unconscious biases.
People have an innate need to make sense of the world. They have to have an answer to the question “Why?”, and when they can’t get one they find themselves in a psychologically uncomfortable state. When we’re confronted with unconnected and diverse facts about a particular situation, we make sense of them by creating a narrative based on our innate biases and tendencies. Those biases and tendencies, combined with confusing or contradictory pieces of information, push us to make sense of the world based on our “gut” feelings. But gut feelings can distort the hard facts, lead us to make poor decisions, and cause us to say some pretty misguided things. Simply by wording something in two different ways, you can lead someone to rely on their gut and passionately defend a position they think is very different from another, only to find, if they look closely, that both options say the same thing. They get so caught up in the peripheral information that they ignore the facts. We rarely use logic and critical thinking to make our decisions in these cases; we simply go with “how we feel.” Let me give you an example. Read the following scenarios and tell me which one you would select if you had to undergo medical treatment:
Imagine you are a patient with lung cancer. Which of the following two options would you prefer?
- Surgery – Of 100 people undergoing surgery, 90 live through the post-operative period, 68 are alive at the end of the first year, and 34 are alive at the end of five years.
- Radiation Therapy – Of 100 people undergoing radiation therapy, all live through the treatment, 77 are alive at the end of one year, and 22 are alive at the end of five years.
When given these two options, only 18% of the people in a study said they would select radiation over surgery. Yet watch what happens when we change the wording a little:
- Surgery – Of 100 people undergoing surgery, 10 die during surgery or the post-operative period, 32 die by the end of the first year, and 66 die by the end of five years.
- Radiation Therapy – Of 100 people undergoing radiation therapy, none die during treatment, 23 die by the end of one year, and 78 die by the end of five years.
When I change the wording in the scenarios, if you’re like most people, radiation therapy suddenly looks much more attractive, because the prospect of dying during surgery starts to scare you. In fact, after the changes above, 44% chose radiation therapy over surgery, simply because of how the option is verbally presented. Look closely, though: there is no difference in the numbers. The statistics stay the same, the facts are the same, and objectively you get the same results. Most people see the numbers (how many people live or die) and make a gut decision. When survival is on the line, we become biased and let our gut decide instead of using the mental resources that would let us focus on the facts.
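To see that nothing but the wording changed, here is a quick check (a plain-Python sketch; the numbers are taken directly from the scenarios above) confirming that every “die” figure is just 100 minus the matching “live” figure:

```python
# Outcomes from the two framings above, out of 100 patients each.
# The "survival" framing reports how many are alive at each point;
# the "mortality" framing reports how many have died.
survival_framing = {
    "surgery":   {"treatment": 90,  "year_1": 68, "year_5": 34},
    "radiation": {"treatment": 100, "year_1": 77, "year_5": 22},
}
mortality_framing = {
    "surgery":   {"treatment": 10, "year_1": 32, "year_5": 66},
    "radiation": {"treatment": 0,  "year_1": 23, "year_5": 78},
}

for option, deaths in mortality_framing.items():
    for period, dead in deaths.items():
        alive = survival_framing[option][period]
        # Alive plus dead always totals the same 100 patients:
        # identical statistics, different wording.
        assert alive + dead == 100

print("Both framings describe exactly the same outcomes.")
```

Every assertion passes: the two presentations are arithmetically interchangeable, which is precisely why the shift in preference has to come from the framing rather than the facts.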
It’s a simple error in thinking that causes us to say and do things that are just wrong. I could go over a myriad of other processes our minds use to bypass critical thinking and make decisions quickly, but I don’t have the space to do so. In psychology, we call these processes heuristics. They help us think fast, and for the most part they are helpful, but they can also cause us to make errors. Add to this what psychological studies call “confirmation bias,” where we actively seek out information that supports our existing opinion and ignore whatever contradicts it, and you have a perfect storm. However, you can correct this if you are brave enough to do so. Daniel Kahneman offers a simple remedy in his book “Thinking, Fast and Slow” to help us think more critically:
“The way to block errors that originate in System 1 (our intuitive mind) is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2 (your rational conscious critical thinking mind).”
If you find yourself making a decision or judgment hastily, feeling hyper-emotional about the situation, or being a lazy thinker, stop. Create “cognitive space” so you can think through what you need to consider. In a world where a statement typed at a keyboard can be broadcast across the globe in a matter of minutes, it is essential that we slow down and think about something before talking about it. It’s the best way to truly appreciate the facts and get past your desire to justify how you feel. If we still care about the truth, we need to train our minds to search for it in the best way possible.