Common cognitive biases series (part 1)

Cognitive biases are systematic errors in thinking and reasoning that result from the way our brains function. Our brains often rely on mental shortcuts, or heuristics, to process information and react quickly. Whether it’s detecting patterns, ignoring unnecessary distractions, or filtering out information, these thinking patterns are believed to have served an evolutionary purpose, enabling us to respond rapidly to our environment.

On the flip side, however, cognitive biases result in a subjective perception and interpretation of facts. And because they are automatic and we are largely unaware of them, they can impair our judgement and decision-making.

In his 2011 book, Thinking, Fast and Slow, psychologist and Nobel laureate Daniel Kahneman introduced the concept of our two thinking minds: system 1 (fast thinking) and system 2 (slow thinking).

The fast thinking system, of which cognitive biases are a feature, works with little effort on our end. It relies on intuition and feelings to produce quick judgements and reactions, and is subject to little voluntary control. We use this system when we drive or make snap decisions.

System 2 thinking is deliberate and analytical. It requires concerted effort and relies on complex thinking patterns such as evaluation and calculation. We employ this type of thinking when we solve challenging problems and work on complex projects. It’s a more conscious engagement of our brains, as opposed to the more automatic one supporting the fast thinking system.

While the two systems serve different functions and purposes, the cognitive biases that our fast thinking minds employ leave us vulnerable to decision-making errors. Becoming familiar with some of the more common cognitive biases and interrogating our reasoning processes can help us make better decisions.

In the words of businessman and investor Charlie Munger, Warren Buffett’s closest partner and right-hand man, and a prominent advocate of mental models:

“It’s remarkable how much long-term advantage people like us have gotten by trying to be consistently not stupid, instead of trying to be very intelligent.”

In a series of articles, I will discuss some of the more common cognitive biases, their implications and some thoughts on how we could avoid them.


The Dunning-Kruger effect: we don’t know what we don’t know, nor what we do know

We tend to overestimate our knowledge and abilities. In fact, there’s an inverse relationship between competence and confidence—the less we know about a particular field, the more likely we are to rate our knowledge of it highly. Conversely, people with high skill in a particular domain tend to underestimate their competence.

Psychologists Justin Kruger and David Dunning, who first described the phenomenon, also argued that we are ignorant of our own ignorance—a concept they called meta-ignorance—because our lack of competence lies in the realm of the “unknown unknowns”. In other words, the less familiar we are with a particular area, the more restricted our mental maps and frames of reference about it are.

Implications

We may be oblivious to our unique strengths because, when we have a high level of competence in a particular area, we assume that what’s easy for us is also easy for others. This may cause us to suffer from impostor syndrome and feel that we’re not worthy of the success we have.

We may be more likely to take greater risks in areas we’re less familiar with.

When it comes to growing and learning, if we overestimate our competence, we may be less likely to seek continual improvement.

The benefits of overcoming it and some thoughts on how to

If we don’t know what we don’t know, it’s difficult to self-correct. And if we don’t know what we do know, it’s difficult to assess ourselves objectively. Fostering curiosity and remaining open-minded, both about new information and about our assessments of ourselves and others, is invaluable: not just because it can help us avoid the pitfalls of our cognitive biases, but also because it exposes us to the joy of continuous discovery.

Anchoring bias: initial information confines the scope of our thinking

The anchoring effect is a cognitive bias that pertains to our tendency to rely too heavily on an initial piece of information (the anchor) when making decisions. Whether the anchor is the price of a rental, the first idea you have about a new project, or a goal you set, it creates a frame of reference and a context within which your thinking occurs. As such, anchoring confines our thinking, converging it around the initial anchor.

In one experiment, two groups of participants were asked what percentage of African countries were members of the United Nations. Prior to that, each group witnessed a roulette wheel being spun. The wheel landed on 10 for one of the groups and on 65 for the other.

Remarkably, there was a correlation between the number participants saw on the wheel and their guesses about African UN membership. On average, the people who saw the wheel stop on 10 estimated that 25% of African countries were part of the UN, while those who saw it stop on 65 gave an average answer of 45%.

Implications

Anchoring bias has implications for pricing and purchasing, cost negotiation and innovation. If most services a company offers are priced between $550 and $1,500, a service that costs $400 may seem like a bargain, even if, objectively, it is overpriced.

The listing price of a rental will influence how much we’re willing to offer for it.

Our ideas about what’s possible and what isn’t anchor our creative thinking. We may be less likely to pursue highly novel ideas if they haven’t been tested.

The benefits of overcoming it and some thoughts on how to

Transcending the limits the anchoring effect imposes on our thinking can help us make better decisions. When we face creative, financial or purchasing decisions, it’s helpful to consider whether there’s an anchor our thinking may be revolving around. Consulting others and seeking feedback also helps.

Cognitive dissonance: we avoid conflicting ideas and concepts

Conflicting thoughts cause psychological discomfort (i.e. cognitive dissonance), which our brains mitigate through a bias for consistency in the ideas, beliefs and values we hold. As such, we are likely to ignore evidence that doesn’t conform with our thinking, or even reason against inconsistencies, in order to maintain cognitive harmony.

Implications

We may reject or argue against information that doesn’t fit with our views—not because the information is false, but because it makes us feel uncomfortable.

We may resist improving our skills and competencies in order to avoid the cognitive dissonance associated with working on ourselves, especially if we’re experiencing impostor syndrome.

The benefits of overcoming it and some thoughts on how to

If we receive negative feedback about a new product we’re working on, our instinct may be to dismiss it. But our product will most likely be stronger if we factor that feedback in.

Similarly, if we’re writing an article, we may tend to ignore information that doesn’t support the point we’re trying to make. But some of the most powerful ideas are conflicting and counter-intuitive.

Accepting the short-term discomfort that cognitive dissonance creates, and learning to sit with it rather than push it away, may improve the quality of our thinking and actions.