10 cognitive biases that distort our thinking during the corona crisis


Photo by Josh Riemer on Unsplash

We humans hold a collective misbelief about our ability to think and act rationally. In reality, our thinking and decision-making are shaped by so-called cognitive biases, a kind of unconscious mental shortcut. We developed them in the course of evolution to save time and energy; cognitive biases help us make lots of decisions quickly and automatically. But under certain circumstances they lead to perceptual distortion, inaccurate judgment, and illogical interpretation: when we face danger, when we deal with complexity, and when we find ourselves in entirely new situations. Our cognitive biases take over, and without us noticing it, our thinking and decision-making become irrational and ineffective in a quite systematic way.

In this article, I describe some of these systematic thinking traps: how 10 cognitive biases affect the way we think, decide, and act during the corona crisis.

1. Belief bias | “People tend to have an unconscious belief that big problems require big solutions,” states Professor Steven Taylor in his book The Psychology of Pandemics. This idea of a sort of balance between problem and solution can be explained by belief bias — our tendency to evaluate conclusions and accept them as accurate because they are believable rather than because they are fact-based or logically valid.

When the Public Health Agency of Sweden tells us that the big problem we face should be met not by big but by small and seemingly trivial solutions, such as washing our hands, coughing into our elbows, and staying home if we feel sick, we experience a cognitive conflict. We want bigger actions. And what matters most is that the actions are perceived to be as big as the problem, not what the actions will actually achieve, or fail to achieve.

We therefore also want to do more to protect our loved ones, so we act in the most forceful way we can and hoard much-needed goods such as toilet paper, wheat flour, and yeast. Not because the behavior is fact-based or logically valid, but because it feels right.

2. Bandwagon effect | In social, and sometimes political, situations we tend to align our beliefs and behaviors with those of the majority. The implication is that since so many other people are doing it, it must be good, or at least acceptable. So when all our neighbors are hoarding food, we feel compelled to do the same. When the stock market plunges, we jump on the bandwagon and start dumping our shares. When most countries in Europe are closing their schools, obviously Sweden should too.

However, the tendency to follow the actions of others has nothing to do with what effect the action will have. The bandwagon effect is instead a kind of herd mentality or groupthink. It can be traced to humanity’s early history, when loyalty between group members — as well as a consensus about what is true and false — was literally a matter of life and death, both for the individual and the group. “Humans have been adapted to survive in a group, not to reach the most accurate perception of reality,” says Professor Mikael Klintman, author of Knowledge Resistance: How We Avoid Insight from Others.

A consequence of the bandwagon effect is that solutions that are easier for the majority to understand will win over solutions that are more creative or complex, even when the easily understandable solutions are proven less effective.

3. Anchoring bias | We humans tend to place excessive emphasis on the first piece of information we receive. Or, to put it another way: the first thing we assess affects our assessment of everything that follows. Moreover, our initial assessment is only marginally influenced by the information we receive later.

One concrete example of anchoring bias is our tendency to judge other people based on first impressions. Any changes we later make to our original assessment are adjustments to that first impression, not completely new assessments.

Anchoring bias affects us in everything from how we evaluate prices to how doctors make diagnoses. And, of course, in how we assess the danger of the coronavirus. When the virus was initially presented as the same kind of virus that caused SARS and MERS, which had case fatality rates of between 10% and 40%, that became the starting point for our assessment of how dangerous COVID-19 is to us. Alternatively, for some of us the anchor was set when WHO Director-General Tedros Adhanom Ghebreyesus announced in early March that the mortality rate was 3.4%.

All the information we have subsequently received about COVID-19’s mortality, we have assessed in relation to that first figure, not as entirely new information. That is why we find it difficult to grasp that the danger of COVID-19 now seems to be considerably lower than we initially thought. A recent estimate by the Centre for Evidence-Based Medicine at the University of Oxford suggests an infection fatality rate somewhere between 0.1% and 0.36%.
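
For illustration, here is a minimal sketch of this anchoring-and-adjustment pattern, assuming a toy model in which we correct our first estimate only part of the way toward new evidence. The function adjusted_estimate and the adjustment factor of 0.4 are hypothetical choices made for illustration, not empirical values:

```python
# Toy model of anchoring-and-adjustment (illustrative only):
# we start from the first number we heard and adjust only
# part of the way toward later evidence.

def adjusted_estimate(anchor: float, evidence: float, adjustment: float = 0.4) -> float:
    """Move from the anchor toward new evidence, but not all the way.

    An adjustment factor below 1.0 models the insufficient correction
    that characterizes anchoring bias; 0.4 is an arbitrary choice.
    """
    return anchor + adjustment * (evidence - anchor)

anchor = 3.4      # the early WHO mortality figure, in percent
evidence = 0.36   # the upper end of the later Oxford IFR estimate
print(adjusted_estimate(anchor, evidence))  # about 2.18, still far above 0.36
```

Even after taking the new figure on board, the anchored estimate stays much closer to 3.4% than to 0.36%.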

4. Framing effect | We humans make different decisions depending on whether the alternatives are presented with positive or negative connotations. Psychologists Amos Tversky and Daniel Kahneman demonstrated this in 1981 with the “Asian disease problem.” A group of subjects was asked to imagine that the United States is preparing for the outbreak of an unusual Asian disease that is expected to kill 600 people.

Half the group had to choose between Program A, which would save 200 people, and Program B, which had a 1/3 probability of saving all 600 people and a 2/3 probability of saving no one. 72% of the participants chose Program A. The other half had to choose between Program C, which would mean that 400 people would die, and Program D, which had a 1/3 probability that no one would die and a 2/3 probability that all 600 people would die. Now 78% preferred Program D.

Interestingly, programs A and C are identical, as are programs B and D. When the programs were presented in terms of saved lives, the participants preferred the secure program A (=C), but when the programs were presented in terms of expected deaths, the participants chose the gamble D (=B).
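
A quick calculation makes the equivalence concrete. The sketch below simply computes the expected number of lives saved under each program, using the probabilities given in the experiment:

```python
# Expected lives saved (out of 600) in the "Asian disease problem"
# (Tversky & Kahneman, 1981). A and C are certain; B and D are gambles.

expected_saved = {
    "A": 200,                          # 200 saved for certain
    "B": (1 / 3) * 600 + (2 / 3) * 0,  # 1/3 chance all 600 saved, else none
    "C": 600 - 400,                    # 400 die for certain -> 200 saved
    "D": (1 / 3) * 600 + (2 / 3) * 0,  # 1/3 chance nobody dies, else all die
}
print(expected_saved)  # every program saves 200 lives in expectation
```

All four programs have the same expected outcome; only the wording changes, and with it our preference.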

The fact that different people and different governments have different views on which solutions are most effective in protecting the population may, therefore, have to do with how they think about their options. If the connotation is positive (save lives, save businesses, etc.), we tend to avoid risks, but if the connotation is negative (avoid death, avoid unemployment, etc.), we tend to seek risk.

5. Availability heuristic | Whatever we can most easily retrieve from memory and imagine seems more important and more likely to us than what we cannot spontaneously recall or picture. This means, for example, that we are more afraid of flying than of driving to the airport, even though car accidents are far more common than plane crashes. Our assessment error stems from the fact that airplane accidents get a lot of space in the media and are therefore more easily accessible to our brain.

Most of us, including our politicians, have seen disaster movies where a nasty cloud of infection forces panicked people to take shelter indoors. Consequently, it’s easy to connect that memory with how the coronavirus spreads. But that picture does not match reality. “The airborne infection is best transmitted indoors during winter. There the air is dry, the breathing drops shrink by evaporation and can travel farther from the mouth. Outdoors, the risk of airborne infection is minimal. Non-existent if you keep a distance,” wrote Professor Agnes Wold in Focus on March 26, 2020.

But most of all, the availability heuristic affects our thinking because we are fed hundreds of images and stories about the coronavirus every day, often focusing on the most affected countries and the sickest people. All of these stories make it easy to imagine the pandemic as even bigger, even closer, and even more dangerous than it is.

Or, as WHO Director-General Tedros Adhanom Ghebreyesus put it: “The coronavirus is a deadly threat. But fear and panic are more dangerous.”

6. Neglect of probability | We humans have a tendency to completely disregard probabilities when making decisions under uncertainty. Small risks are either greatly overestimated or completely neglected. After September 11, for example, many Americans refused to fly because they felt the risk had increased, even though flying was statistically no riskier than before the terrorist attacks. The same applies to the fear most of us have experienced when swimming in unknown waters: that a shark will attack us. The fear persists even though we know the likelihood of a shark attack is almost non-existent, while the risk of drowning is considerably higher.

Linked to COVID-19, the neglect of probability means that we tend to devote most of our attention to the mere possibility that the worst-case scenario will occur. And very little to the crucial question: How likely is it to happen?

“There’s a tendency to focus on the worst-case scenario,” notes Carl Heneghan, Director of the Centre for Evidence-Based Medicine at the University of Oxford, in a BBC Future article on April 2, 2020. For example, the H1N1 pandemic of 2009, known as swine flu, had early case fatality rate estimates inflated by a factor of more than 10. When medics later evaluated cases, the actual H1N1 case death rate was far lower, at 0.02%. “That’s not a cause for complacency. But it could be an antidote to some of the alarm at the very high reported death rates in some countries,” says Heneghan.
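
Part of the explanation for such inflated early estimates is mechanical: at the start of an outbreak, testing mostly finds severe cases, so the denominator misses mild and undetected infections. The sketch below illustrates the difference between a case fatality rate and an infection fatality rate using entirely invented numbers, not actual H1N1 or COVID-19 data:

```python
# Why early fatality estimates run high: the case fatality rate (CFR)
# divides deaths by *detected* cases, while the infection fatality rate
# (IFR) divides deaths by *all* infections. All figures below are invented.

deaths = 20
detected_cases = 1_000      # mostly severe cases found by early testing
total_infections = 50_000   # including mild and asymptomatic infections

cfr = deaths / detected_cases    # 0.02   -> 2.0%
ifr = deaths / total_infections  # 0.0004 -> 0.04%
print(f"CFR: {cfr:.1%}  IFR: {ifr:.2%}  inflation: {cfr / ifr:.0f}x")
```

The same number of deaths looks fifty times deadlier when only detected cases enter the denominator.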

7. Negativity bias | We humans are more likely to pay attention to and focus on things that are dangerous or negative than on things that are neutral or positive. We do this because it has proven to be a winning strategy in our evolution. In the early history of humanity, individuals who paid more attention to the bad, the dangerous, and the negative also had a better chance of survival.

In today’s world, negativity bias makes life more difficult for us by, for example, continually reminding us of that embarrassing slip of the tongue we made several weeks ago. It is also why we tend to put much more time and energy into processing criticism than into processing praise.

Negativity bias also makes us more receptive to bad news than to good news. In fact, bad news is so important to us that we perceive it as more believable and true. That is why we are much more interested in bad news, and why bad news is much more interesting to report.

In short, we pay a disproportionate amount of attention to everything that is bad for us, which makes the bad seem bigger and more important than it actually is.

8. Information bias | We tend to believe that the more information we get, the better decisions we can make, even when the information we seek is not relevant to the decision. Especially in the face of difficult decisions, or choices we have no previous experience with, we feel we need to know everything to make the right decision.

But more information does not necessarily lead to better decisions. Often, too much information distracts decision-making from what really matters, and sometimes the search for more information becomes more important than making decisions and taking action.

When the situation is new, and there’s no best practice to adopt, as now during the corona crisis, it is often better to act and learn along the way than to do nothing while waiting for that last piece of missing information.

9. Dunning-Kruger effect | The Dunning-Kruger effect is a distortion of our self-perception that makes us regard ourselves as smarter and more competent than we actually are. Often, only a small amount of knowledge about a subject is enough to make us wrongly believe that we know everything there is to know about it. When we have that conviction, we have climbed the Peak of Mount Stupid.

The Dunning-Kruger effect explains why self-proclaimed corona experts suddenly seem to be a dime a dozen. And why we seem to think that knowledge gained from a Facebook group is enough to know which measures are needed to slow down the spread of the virus in the best way possible.

The paradox is, according to psychologists David Dunning and Justin Kruger, that the knowledge and skills required to become good at something are often the same qualities that are needed to realize that we are not good at it. So if we lack that ability, we not only remain ignorant, we also remain unaware of our incompetence.

For that reason, we sometimes find it difficult to realize that other people are much more knowledgeable and competent than we are, even when we know they have more insight, education, and experience in the matter.

10. Confirmation bias | We humans approach every question, situation, and context with some kind of preconceived opinion. We then want to confirm this subjective truth at almost any cost, since being able to validate our preconceived opinion gives us a sense of control over the world around us.

Confirmation bias means that we seek information that confirms previous conceptions or prejudices. It affects how we collect information, where we collect information, how we interpret information, and even how we remember information. Those who, for example, are convinced that COVID-19 is the beginning of the end of the world will seek information and listen to people who support their beliefs. Information that contradicts their belief will not be noticed, or they will interpret the information in a way that confirms their existing opinion. Also, their memory will focus only on the details that reinforce their original belief.

Everyone suffers from confirmation bias. Even if you consider yourself to have an unusually open mind and to regularly examine the facts before reaching conclusions, it is most likely that preconceived opinions and prejudices shape your thinking nonetheless.

The research was rewarded with the Nobel Prize | Research on cognitive biases was pioneered by Amos Tversky and Daniel Kahneman as early as 1972. Thirty years later, Kahneman received the Nobel Memorial Prize in Economic Sciences. In his book Thinking, Fast and Slow, he summarizes this research and describes our two ways of thinking: the automatic System 1 and the hard-working System 2.

System 1 operates at the unconscious level and is constantly active, while System 2 is switched on only when you really need to focus on something. The dilemma is that our brain wants to avoid using the conscious System 2 for as long as possible because it is so convenient to use System 1. Therefore, most of our thinking and decision-making takes place in the unconscious. To quote Emmanuel Donchin, director of the Laboratory of Cognitive Psychophysiology at the University of Illinois: “An enormous portion of our cognitive activity is nonconscious, figuratively as much as 99%.”

Learn how to manage your cognitive biases | Because cognitive biases operate at the unconscious level, we cannot identify them in ourselves (but we can see them in others). Thus, neither you nor I can eliminate the effects of cognitive biases. However, it is possible to train oneself to reflect on them and their impact on our thinking and decision-making.

If you want to practice managing your cognitive biases, you can start with the following five debiasing techniques:

1. Gain knowledge. The first step is to know which cognitive biases exist, how they affect you, and under what circumstances. The second is to realize that you will still not be immune; after all, you’re a human being. Should you think that you’re not as affected by cognitive biases as everyone else, here’s a bonus bias just for you: the bias blind spot.

2. Practice metacognition. When you know how you are affected, you can begin to evaluate your thinking and your decisions; that is, you can practice metacognition: thinking about your thinking. Regularly revisit how you think about, reason about, analyze, and evaluate different things. Above all, consider how and why you make decisions. Question yourself and your conclusions: in what other ways could you think?

3. Consider options. Make a habit of exploring and considering options other than the one you think is best. Be your own devil’s advocate. Approach and listen to people who hold the opposite opinion and, most importantly, try wholeheartedly to understand why.

4. Appreciate uncertainty. “I don’t know” is an underestimated starting point. Uncertainty is a necessary ingredient on the way to well-grounded and rational solutions. Embrace uncertainty as an opportunity to sharpen your thinking and, above all, to learn from your actions.

5. Keep track of your ego. Reflect on your self-image: how confident are you about your opinions? How much time and energy do you spend proving you’re right compared to finding out what you may have missed? Ask yourself: Why do I think I know this? Try to focus more on exploring other possibilities than on proving your knowledge or status.

A time to be wise | Professor Steven Taylor, the author of The Psychology of Pandemics, wrote in the Independent on March 17, 2020: “The more I read about pandemics, the more I realized that pandemics were essentially psychological phenomena. Pandemics were not simply about some virus infecting people. Pandemics were caused and contained by the way that people behaved. Pandemics are controlled only when people agree to do particular things, like covering their coughs, washing their hands, complying with physical distancing, and getting vaccinated, if a vaccine is available. If, for various psychological reasons, people refuse to do these things, then the pandemic will continue to spread.”

Of course, we must take COVID-19 seriously. It’s a deadly virus. But we can’t let fear and panic take over. If we do not take into account that cognitive biases distort our thinking, there is an acute risk that we fail to notice how our decisions become irrational and our actions ineffective. After all, we’re witnessing both a health crisis (the coronavirus) and a social crisis (business, unemployment, etc.).

We need to be able to handle both wisely and effectively.

Photo: CC BY 4.0 Micco Grönholm — mix of CC BY SA 4.0 www.vperemen.com and CC BY SA 4.0 Goran Paunovic. Illustration: CC BY 4.0 Micco Grönholm.

About the Author

Micco Grönholm — Game changer, enabler, and inspirer. Helping people, organizations, and brands succeed. Currently as Head of Future, City of Helsingborg, Sweden.

Reproduced at The Minimalist with explicit permission under Creative Commons license CC BY-NC 4.0 (https://creativecommons.org/licenses/by-nc/4.0/). Originally published — 10 cognitive biases that distort our thinking during the corona crisis
