How Heuristics Can Affect Good Decision-Making

Have you ever been caught out by heuristics? I have!

Cowering in a ditch, I knew that there was a good chance I could be killed or seriously injured by the explosion that was just seconds away. While awaiting my self-induced demise, I had a short time to consider my hubris.

I was a bomb disposal officer. I had been trained to deal with dangerous devices and I also had operational experience. So, if I was such an expert, how did I get into this mess? In a critical situation, despite my training, I had made an error in my decision-making.

Pride comes before a fall

Let’s leave me and the ditch for the moment and let me ask you a question:

How do you make good decisions?

Have you thought about the process of making choices? It turns out that, although we can all make decisions, the psychology is quite complicated. If you had asked me that same question back then, early in my career, I would have talked to you about the power of logical thought and how a systematic approach to decision-making would ensure good decisions.

Well, I was learning the hard way that there is more to decision-making than just assessing factors and choosing a course of action. There are also things called heuristics that – when used poorly – can spoil our plans.

Going out with a bang

First, let’s get back to me in that ditch. In fact, let’s wind back a little and see how I got there.

The day had started well. It was beautiful. The sky was so big and blue I could stare at it and just lose myself there, wrapped in the warmth of the sun, as I waited for a call from my squad. I was doing what I loved: leading my team, trying to make the world a safer place by removing dangerous objects from this magnificent African landscape. And it was fun too; blowing stuff up is fun (until you are caught up in the explosion, that is).

On that day there were over a hundred people out scouring mile after mile of the countryside looking for dangerous material. This could be unexploded artillery shells, mortar rounds and even the occasional big bomb. When they found something, they would call me, as they did that day.

I was hailed on my radio and given a location some miles away. My colleague and I drove as close as we could in our Land Rover 4×4, then covered the last mile or so on foot when the terrain became too difficult. We arrived to find a large pile of artillery shells that needed to be disposed of.

Situational analysis

At this point, our training and experience kicked in. We used our question technique to assess the situation and came up with a plan.

We had been instructed to use the 5Ws to help assess a situation. The 5Ws are the interrogative words of the English language: what, where, when, who and why. The other common interrogative, ‘how’, is generally added to them.

The 5Ws would provide a structure to understand the situation. For example:

  • What are we dealing with? In this case, a pile of old artillery shells.
  • Why are they there? They were fired from guns, but the fuse mechanisms failed to detonate on impact.
  • Where are they? In a difficult-to-access area of bush. So what? We would have to go in and out on foot.
  • Who is in danger? Just my colleague and me; the rest of the area is clear for miles.
  • How can they be disposed of? Correct application of plastic explosive and a manual timed fuse.
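
The 5Ws (plus how) drill is essentially a checklist, and it can be sketched in a few lines of code. This is purely illustrative – the function and example answers below are my own, not an official assessment format:

```python
# A minimal sketch of the 5Ws (+ how) assessment as a checklist.
# Field names and example answers are my own illustration.
INTERROGATIVES = ["what", "where", "when", "who", "why", "how"]

def assess(answers):
    """Return the answers plus any interrogatives left unanswered."""
    missing = [q for q in INTERROGATIVES if not answers.get(q)]
    return {"answers": answers, "unanswered": missing}

assessment = assess({
    "what": "a pile of old artillery shells",
    "why": "fired but failed to detonate on impact",
    "where": "difficult-to-access bush; foot access only",
    "who": "just the two of us; area clear for miles",
    "how": "plastic explosive with a manual timed fuse",
})
print(assessment["unanswered"])  # ['when'] – a question not yet answered
```

Flagging what has not been answered is the useful part: much of the framework’s value lies in making sure no question gets skipped.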

The answers to the questions informed our plan. And, as we did not have our vehicle nearby, we needed somewhere close that would provide us with some cover. We looked around and chose a small hillock in the distance that looked promising.  We estimated how long it would take us to walk there and then cut the fuse to the correct length.
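
The fuse calculation itself is simple arithmetic: the time needed to reach cover (plus a margin) multiplied by the fuse’s burn rate. The burn rate, walking pace and distances below are assumed values for illustration only, not the real figures from that day:

```python
# Illustrative fuse-length arithmetic; all constants are assumptions.
BURN_RATE_CM_PER_S = 1.0      # assumed burn rate of the safety fuse
WALK_SPEED_M_PER_S = 1.4      # assumed brisk walking pace

def fuse_length_cm(distance_to_cover_m, margin_s=60.0):
    """Fuse length needed to walk to cover, plus a safety margin."""
    time_needed_s = distance_to_cover_m / WALK_SPEED_M_PER_S + margin_s
    return time_needed_s * BURN_RATE_CM_PER_S

# If the hill is really 1,200 m away but is judged to be only 700 m,
# the fuse comes up roughly six minutes of burn time too short.
print(fuse_length_cm(700))    # 560.0 cm – cut from the flawed estimate
print(fuse_length_cm(1200))   # ~917 cm – what was actually needed
```

The arithmetic is only as good as the distance estimate feeding it, which is exactly where things went wrong.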

Bomb disposal: if you see me running, try to keep up

After checking our work, we lit the fuse, checked our watches and set off towards the small hill that would give us cover.  We chatted about important things such as how many letters we had received from home that week and how much we wanted a cold beer.  The funny thing was the escarpment was not getting any closer.  Our pace increased.

We laughed and joked, and we walked briskly along, but looking at our watches gave us some cause for alarm. We broke into a run. There was no longer any laughing or even chatting. All that was said was: “We are not going to get there in time, do you see any other cover?” We spotted what seemed to be a series of gullies over to our left, so we headed towards them. Upon reaching them, our relief quickly turned to anxiety because the shallow angle of the gully slopes would afford us little cover. We ran on. At this point in the proceedings, I sent up a quick prayer, and with only seconds to go we dived into a shallow pit and crouched down with our backs to the sand. We had to compress ourselves to keep our heads below the parapet of the depression.

Going out with a bang?

For a few seconds, the only sound was our thumping hearts, heavy breathing and the noise of a nonchalant fly investigating my hat. Then we felt the explosion – a pulse through the earth and a punch through the air.  We looked at one another.  No words were exchanged but much was communicated.  We were both thinking – that was a bigger bang than expected; we felt dreadfully close!

Next, there was a sound that made me flinch – it was like an angry hornet going past my ear – and then there was another, followed by little thuds and puffs of sand as the shrapnel came down around us.  As the deadly rain struck the ground there was little we could do, so I opted to laugh and my Sergeant used a varied, colourful (but sadly unprintable) string of expletives to express his feelings.

When our self-induced bombardment came to an end, and it was obvious we were both not only alive but also unharmed, we spent a few precious seconds enjoying the quiet.  The same solitary fly, who seemed oblivious to the proceedings, was still taking an interest in my hat.

Not surprisingly, the whole experience made me ponder my decision-making.

The problem with heuristics and the dangers of bias

I had been trained in decision-making and planning, so what had gone wrong that day?

Well, in this case, one good decision-making tool had been undermined by another. My plan for dealing with the shells was sound, but it was let down by the simple heuristic I employed to choose my cover.

When judging distance, I was unknowingly using a scaling heuristic; in other words, I was estimating how far away the hillock was from its apparent size. The problem is that this method only works well if you are looking at an object of familiar size – such as a person or a vehicle – or have something nearby to compare it with. In this case, I was looking at a hill: I did not know its actual size, and there was nothing else in the bare landscape to compare it with. The hill was a lot bigger, and a lot farther away, than I estimated.
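
The scaling heuristic can be made explicit with the small-angle relationship: the angle an object subtends at the eye is roughly its size divided by its distance, so the brain infers distance as assumed size divided by that angle. The sizes and distances below are illustrative assumptions, not measurements from that day:

```python
# The scaling heuristic made explicit; all numbers are illustrative.
def inferred_distance_m(assumed_size_m, angular_size_rad):
    """Distance estimate under the small-angle approximation."""
    return assumed_size_m / angular_size_rad

true_size_m = 60.0        # the hill's real height, unknown to the observer
true_distance_m = 1200.0
angle_rad = true_size_m / true_distance_m   # what the eye actually sees

# Assume the hill is only 35 m tall, and it appears much closer than it is:
estimate_m = inferred_distance_m(35.0, angle_rad)
print(estimate_m)   # roughly 700 m – a dangerous underestimate of 1,200 m
```

With no familiar object in frame to pin down the true size, the error in the assumed size passes straight through to the distance estimate.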

“This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”

Daniel Kahneman

What is a heuristic?

A heuristic is a simple decision-making hack or rule of thumb. We use these all the time in our thinking and choices.

One example would be how we choose things when we shop at the supermarket. Most of the things we pick up will be the same items we usually get. If you examine your groceries, most will be from suppliers that you know and regularly use. We do this largely to save us from making endless decisions. If we had to start again every time we went to the shops – not knowing what we liked or could trust – then it would take an age to select each thing.

Consider the bewildering number of choices on offer in most shops these days. Without this simple heuristic, we could suffer from analysis paralysis. In other words, without a simple way to make decisions, the processing power of our brains could get overwhelmed by the sheer quantity of data.
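
The shopping example can be caricatured in code: a default-choice heuristic answers most decisions instantly and only falls back to slow deliberation when the default is unavailable. The brand names and the fallback rule below are invented for illustration:

```python
# A toy 'buy the usual' heuristic; brands and the fallback are made up.
USUAL_BRANDS = {"coffee": "BrandA", "bread": "BrandB"}

def deliberate(options):
    """Stand-in for slow, effortful comparison of every option."""
    return sorted(options)[0]

def quick_pick(category, options):
    """Grab the familiar brand if it is on the shelf; deliberate otherwise."""
    default = USUAL_BRANDS.get(category)
    if default in options:
        return default             # fast, near-automatic choice
    return deliberate(options)     # the expensive path the heuristic avoids

print(quick_pick("coffee", ["BrandC", "BrandA"]))   # BrandA – no thought required
print(quick_pick("milk", ["BrandX", "BrandY"]))     # BrandX – forced to deliberate
```

The fast path is what saves us from re-deciding everything on every trip; the slow path is what the heuristic exists to avoid.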

That is why manufacturers fight so hard for brand recognition and product loyalty. They know that if they can make you switch to their product then you are very likely to stick with it. That is why they are willing to cut prices and make special offers to tempt you to change your habits.

And that is just one example of a heuristic. We use these thinking tools in everything from catching a ball to choosing where we sit in a cinema.

Are heuristics good or bad?

Heuristics are not bad in themselves. As mentioned, they are useful mental short-cuts that save us time and generally help us to make quick effective judgements. But each heuristic is a simplified model so it cannot take in all the complexities of a situation. Therefore, heuristics must rely on certain assumptions. Once again, assumptions are not intrinsically bad, but some assumptions can be wrong, or just inaccurate in some circumstances.

That is why we need to be aware of the heuristics we use and when we are using them. Going back to my example, there is nothing wrong with the scaling heuristic. Using relative sizes and distances is a well-known and very useful tool for judging distance. The problem was that I applied the tool bluntly, without considering whether any of my assumptions were wrong. My assumptions were wrong because of cognitive bias – in this instance, confirmation bias (but that is a subject for another post).

Use heuristics but beware of hubris

I had a good process for making decisions (using interrogatives), but in this story one little mistake nearly cost me my life and that of my colleague. In my case, it was my estimate of distance that undermined my plan. My hubris – my overconfidence – meant I did not examine my assumptions.

That does not mean that the heuristic or the rest of the plan was bad. Far from it. Using heuristics, having a decision-making framework and other planning tools can help us make better decisions.

But, as we plan, we must be cognisant of the heuristics and other processes that we are using, especially if those decisions are important. If we are choosing a coffee, fine – we can take a risk and assume the barista knows what they are doing. But if you have a bigger decision to make, such as hiring a builder to extend your house, then it is worth examining your options rather than assuming anyone can do the work just because they say so.

So, if you want to make better choices today, ask yourself two questions:

  1. What heuristic or process am I using?
  2. What assumptions am I making and are they correct?

Then you will be on the path to better, more effective decisions.
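
Those two questions amount to a small assumption audit. Here is a minimal sketch, with example assumptions of my own invention:

```python
# A minimal 'assumption audit' for the two questions above.
def unverified_assumptions(assumptions):
    """Return the assumptions that have not been verified."""
    return [a for a, verified in assumptions.items() if not verified]

unchecked = unverified_assumptions({
    "the object being judged is of known size": False,
    "there are comparable objects nearby": False,
    "the terrain allows a straight walk to cover": True,
})
print(unchecked)   # the two assumptions that sank the plan in the story
```

Listing assumptions explicitly is the point: the ones you never write down are the ones that go unexamined.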

If you want the right answers you have to start with the right questions

About The Right Questions

The Right Questions is for people who want greater clarity, purpose and success. There is a wealth of resources to boost your effectiveness in achieving goals, your leadership of yourself and others, and your decision-making.

Wherever you are on your journey, I hope that you find information on this site to help you on the next leg of your quest. Even if that is just the inspiration to take one small step in the right direction, then that is a success. If you can take pleasure in learning and travelling as you go, then so much the better.

Need help navigating your journey to success?

I love to serve people, helping them unlock their potential, empowering them as leaders, and assisting them in achieving their goals. Please get in touch and let me know how I can support you!

How Cognitive Bias Influences Thinking And Decision-Making

Saying that someone is biased is usually a criticism. We point it out when someone obviously favours one person or thing over another. Bias is synonymous with prejudice, and that has even more negative connotations. Being prejudiced means holding an opinion not based on reason or reality, and thinking of that sort leads to bad judgements. In other words, we associate bias with poor thinking and bad decision-making.

But here is the kicker. We are all biased.

The scientific truth about bias

The definition of cognitive bias is:

“The way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.”

So, bias has an individual flavour, but research has shown that there are common trends in how our biases manifest. In the ever-increasing body of scientific study of cognitive bias, the best place to start is Daniel Kahneman’s book, Thinking, Fast and Slow.

To explain the title, fast thinking (also called system 1 thinking) is the intuitive, often unconscious judgement that we make. Slow thinking (or system 2 thinking) is the more (seemingly) logical, conscious thought process that we employ. Neither system is better than the other; they are just different. Kahneman’s research shows that both systems are influenced by cognitive biases, and we generally choose to ignore these influences in our decision-making.

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

Daniel Kahneman

The brain is mind-blowing 

Therefore, bias affects all thinking, and decision-making is just one such thought process influenced by cognitive biases.

The way the brain works when making choices is a marvel of creation. The neural substrates that support our decision-making are not fully understood but, when we make choices, our brain is a lightstorm of synaptic activity, igniting the prefrontal cortex and pulsing out into the hippocampus, posterior parietal cortex and striatum.

Even before we know what we are thinking, our network is leaping into action. Dendrites are stimulated and neurons are firing signals through the axons to other neurons at an astonishing rate.

Ironically, thinking about decision-making like this is mind-boggling!

Thinking of the brain simply as a computer is a poor analogy but, as with computer processing, the brain loves speed. And, to be as fast as possible, firing neurons love to take shortcuts. These shortcuts can help us make judgements at much greater velocities, but as I have already talked about with heuristics, these shortcuts can get us into trouble at times. They have nearly killed me on occasion! Closely linked to heuristics and these shortcuts are cognitive biases.

An example of my biases laid bare

I was wondering how to illustrate the effect of bias on our thinking and decisions and then something happened to me that made me examine many of my own biases. Let me share it with you.

Simon Sinek is one of my favourite writers and speakers, and earlier this year he gave a talk about work and how we should love what we do. As ever, his message was heartfelt and compelling. There was so much I could agree with. I generally do love my work. As a leader, I want the people who work for me to love what they do, and I feel the responsibility for creating that psychologically safe environment where people feel supported and allowed to flourish.

The funny thing was that I had a little niggle in the back of my head, telling me something was not quite right. So, I watched it again. Afterwards, reflecting on what Mr Sinek was saying, I was able to identify what had got my spider-sense tingling.

What makes a statement true and why do we believe it?

There were a couple of statements that I started to re-examine. The first one is:

“It is a right, it is a God-given right, that we should love where we work.”

Simon Sinek

As Simon says those words my heart is saying “Amen brother!” but my head is saying, “Is that actually true?”

So, I examine the statement again, leaning on the wisdom of others and the power of logical syllogism. The philosopher Karl Popper would start by pointing out that the statement is not a scientific one: it is not phrased as a testable premise, and it cannot be disproven by empirical means.

If you add theologians into the mix, they will point out that holy books such as the Bible or Koran don’t exactly say that loving work is a God-given right. The emphasis is on loving God and other people rather than work itself.

So, let’s use Simon Sinek’s own advice and “start with why” when we think about his statement. Why does he say that? The statement is actually a rhetorical device, used for emphasis and emotional response. And in those terms, it achieves its ends. But that leaves the question, why do I want the statement to be true, even if it isn’t a fact?

12+ common cognitive biases in under 2 minutes

This is where we come to the flaws in my processing. My thinking is being influenced by multiple biases at once.

The first thing that I am experiencing is the halo effect. In other words, I am likely to agree with whatever Simon Sinek says because I like and respect him. I expect him to be right. There is an immediate anchoring effect too, because when I see Simon Sinek, I think of Start With Why, a book I really enjoy. Therefore, I am expecting to like what he says. This leads to acquiescence bias, making me more likely to agree with any statement Simon makes.

Sinek is also a leadership guru who talks about things I care about, so I am also suffering from in-group bias, where I favour other leadership geeks; we are the same tribe. Hot on the heels of these preconceptions is groupthink. The interviewer and the people in the audience all seem to be nodding and smiling. I want to go with the consensus. What’s more, no one is challenging what he is saying, so there is also a bystander effect. I am not going to make myself look stupid by saying something might be wrong if they all seem to agree with him.

Yes, there are more biases yet!

Next is optimism bias. Simon Sinek is a self-proclaimed optimist, so it is not surprising that his message is alluringly optimistic. Also, I want it to be true – as I want to love all my work all the time – so confirmation bias creeps in. I hold to the belief that we can love work, so I start to suffer from belief bias too. I am also suffering from the just-world hypothesis by expecting things to be fair and for people to get what they deserve. Unfortunately, that is not reality.

That one line of logical fallacy is wrapped up amongst a host of other statements that I agree with and so there is also a framing effect. What I see as the validity of the whole talk influences the context in which I judge any one phrase.

And there is more. The more I learn about leadership, the more I realise how much I don’t know – this is the Dunning-Kruger effect. So, I want to learn, and Simon Sinek is an expert I respect, so I naturally doubt the limits of my own knowledge. When I think about my own abilities, I have a negativity bias and suffer from imposter syndrome; therefore, I doubt myself even more.

The tip of the cognitive iceberg

So, I managed at least 12 cognitive biases in the space of two minutes! And those are just the ones I can easily identify. There are many more biases, and if you would like an introduction to a few more of the common ones, I recommend you visit yourbias.is.

So, what am I trying to say? That what Simon Sinek says is bad? No, far from it. I remain a fan even if I cannot agree with every single thing he says. And that is fair enough; I don’t think I would agree with everything that has come out of my own mouth if I could only remember things accurately (rather than suffering from the misinformation effect)!

The example I used was to demonstrate the dizzying quantity of cognitive biases that can be at play every time we think about something.

You are biased, but don’t panic. Reflect.

The takeaway point is that bias affects us all, all the time. And the tricky thing about bias is that it is hard to spot in ourselves (although, seemingly, we are able to spot it more easily in other people!).

“We can be blind to the obvious, and we are also blind to our blindness.”

Daniel Kahneman

If we want to think clearly and make effective decisions, we need to be aware. We cannot avoid cognitive bias but, if you are aware of your thought processes, you can reflect and critique your own thinking.

You can put your newfound knowledge into practice by having another look at the Simon Sinek interview yourself. There is much to enjoy in what he says but there is also at least one more logical fallacy or false statement in there. Can you identify what it is?

And, when you scroll through your social media today think about your reactions to what you are seeing. How are you being influenced? Which biases can you identify in your own thinking?

Congratulations! You are creating the foundations for better thinking and more effective decisions.
