Calling someone biased is usually a criticism. We point it out when someone obviously favours one person or thing over another. Bias is synonymous with prejudice, which carries even more negative connotations: being prejudiced means holding an opinion not based on reason or reality, and thinking of that sort leads to bad judgements. In other words, we associate bias with poor thinking and bad decision-making.
But here is the kicker. We are all biased.
The scientific truth about bias
The definition of cognitive bias is:
“The way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.”
So, bias has an individual flavour to its complexity, but research has shown some common trends in how our biases manifest. In the ever-increasing body of scientific study of cognitive bias, the best place to start is with Daniel Kahneman’s book, Thinking, Fast and Slow.
To explain the title, fast thinking (also called system 1 thinking) is the intuitive, often unconscious judgement that we make. Slow thinking (or system 2 thinking) is the more (seemingly) logical, conscious thought process that we employ. Neither system is better than the other; they are just different. Kahneman’s research shows that both systems are influenced by cognitive biases, and we generally choose to ignore these influences in our decision-making.
“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
Daniel Kahneman
The brain is mind-blowing
Bias, then, affects all thinking, and decision-making is just one such thought process influenced by cognitive biases.
The way the brain works when making choices is a marvel of creation. The neural substrates that support our decision-making are not fully understood but, when we make choices, our brain is a light storm of synaptic activity, igniting the prefrontal cortex and pulsing out into the hippocampus, posterior parietal cortex and striatum.
Even before we know we are thinking, our network is leaping into action. Dendrites are stimulated and neurons are firing signals through the axons to other neurons at an astonishing rate.
Ironically, thinking about decision-making like this is mind-boggling!
Thinking of the brain simply as a computer is a poor analogy but, as with computer processing, the brain loves speed. And, to be as fast as possible, firing neurons love to take shortcuts. These shortcuts can help us make judgements at much greater velocities, but as I have already talked about with heuristics, these shortcuts can get us into trouble at times. They have nearly killed me on occasion! Closely linked to heuristics and these shortcuts are cognitive biases.
An example of my biases laid bare
I was wondering how to illustrate the effect of bias on our thinking and decisions and then something happened to me that made me examine many of my own biases. Let me share it with you.
Simon Sinek is one of my favourite writers and speakers, and earlier this year he gave a talk about work and how we should love what we do. As ever, Simon’s message was heartfelt and compelling. There was so much I could agree with. I generally do love my work. As a leader, I want the people who work for me to love what they do, and I feel responsible for creating a psychologically safe environment where people feel supported and can flourish.
The funny thing was that I had a little niggle in the back of my head, telling me something was not quite right. So, I watched it again. Afterwards, reflecting on what Mr Sinek was saying, I was able to identify what had got my spider-sense tingling.
What makes a statement true and why do we believe it?
There were a couple of statements that I started to re-examine. The first one is:
“It is a right, it is a God-given right, that we should love where we work.”
Simon Sinek
As Simon says those words my heart is saying “Amen brother!” but my head is saying, “Is that actually true?”
So, I examine the statement again, leaning on the wisdom of others and the power of logical syllogism. The philosopher Karl Popper would start by pointing out that the statement is not a scientific one: it is not phrased as a falsifiable premise, so it cannot be tested or disproven by scientific means.
If you add theologians into the mix, they will point out that holy books such as the Bible or Koran don’t exactly say that loving work is a God-given right. The emphasis is on loving God and other people rather than work itself.
So, let’s use Simon Sinek’s own advice and “start with why” when we think about his statement. Why does he say that? The statement is actually a rhetorical device, used for emphasis and emotional response. And in those terms, it achieves its ends. But that leaves the question, why do I want the statement to be true, even if it isn’t a fact?
12 common cognitive biases in under 2 minutes
This is where we come to the flaws in my processing. My thinking is being influenced by multiple biases at once.
The first thing I am experiencing is the halo effect: I am likely to agree with whatever Simon Sinek says because I like and respect him, so I expect him to be right. There is also an immediate anchoring effect, because when I see Simon Sinek I think of Start With Why, a book I really enjoyed. Therefore, I am expecting to like what he says.
Sinek is also a leadership guru and talks about things I care about, so I am also suffering from in-group bias, where I favour other leadership geeks; we are the same tribe. Hot on the heels of these preconceptions is groupthink: the interviewer and the people in the audience all seem to be nodding and smiling, and I want to go with the consensus. What’s more, no one is challenging what he is saying, so there is also a bystander effect. I am not going to make myself look stupid by suggesting something might be wrong when they all seem to agree with him.
Yes, there are more biases yet!
Next is optimism bias. Simon Sinek is a self-proclaimed optimist, so it is not surprising that his message is alluringly optimistic. Also, I want it to be true – as I want to love all my work all the time – so confirmation bias creeps in. I hold to the belief that we can love work, so I start to suffer from belief bias too. I am also suffering from the just-world hypothesis by expecting things to be fair and for people to get what they deserve. Unfortunately, that is not reality.
That one line of logical fallacy is wrapped up amongst a host of other statements that I agree with and so there is also a framing effect. What I see as the validity of the whole talk influences the context in which I judge any one phrase.
And there is more. The more I learn about leadership, the more I realise how much I don’t know: the Dunning-Kruger effect at work. I want to learn, and Simon Sinek is an expert I respect, so I naturally doubt the limits of my own knowledge. And when I think about my own abilities, negativity bias and imposter syndrome make me doubt myself even more.
The tip of the cognitive iceberg
So, I count at least 12 cognitive biases in the space of two minutes, and those are just the ones I can easily identify. There are many more biases, and if you would like an introduction to a few of the common ones, I recommend you visit yourbias.is.
So, what am I trying to say? That what Simon Sinek says is bad? No, far from it. I remain a fan even if I cannot agree with every single thing he says. And that is fair enough; I don’t think I would agree with everything that has come out of my own mouth if I could only remember it accurately (rather than falling prey to the misinformation effect)!
My example was intended to demonstrate the dizzying number of cognitive biases that can be at play every time we think about something.
You are biased, but don’t panic. Reflect.
The takeaway point is that bias affects us all, all the time. And the tricky thing about bias is that it is hard to spot in ourselves (although we seem able to spot it far more easily in other people!).
“We can be blind to the obvious, and we are also blind to our blindness.”
Daniel Kahneman
If we want to think clearly and make effective decisions, we need to be aware. We cannot avoid cognitive bias, but if we are aware of our thought processes, we can reflect on and critique our own thinking.
You can put your newfound knowledge into practice by having another look at the Simon Sinek interview yourself. There is much to enjoy in what he says but there is also at least one more logical fallacy or false statement in there. Can you identify what it is?
And, when you scroll through your social media today think about your reactions to what you are seeing. How are you being influenced? Which biases can you identify in your own thinking?
Congratulations! You are creating the foundations for better thinking and more effective decisions.