
Overcoming Optimism Bias

Would you cross the road at this 80 km/h location?


What if a 10-year-old child were crossing the road with you?
What if you knew that two pedestrians had been killed at this location?

Would you stand unsecured on the edge of a 12-storey building?


What would you change in your thinking if someone told you that being hit by a car travelling at 80 km/h as you crossed the median strip was equivalent to falling off a 12-storey building?
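The car-versus-fall comparison can be sanity-checked with basic kinematics. Here is a minimal sketch, assuming simple free fall with no air resistance and a nominal 3 m per storey (both assumptions for illustration, not figures from the article):

```python
# Rough free-fall equivalence: from what height does a fall reach a given
# impact speed? Using v^2 = 2*g*h, so h = v^2 / (2*g).
# Air resistance and impact dynamics are ignored.

G = 9.81         # gravitational acceleration, m/s^2
STOREY_M = 3.0   # assumed height per storey in metres (varies by building)

def equivalent_fall_height(speed_kph: float) -> float:
    """Height (m) of a free fall whose impact speed equals speed_kph."""
    v = speed_kph / 3.6          # convert km/h to m/s
    return v ** 2 / (2 * G)

h = equivalent_fall_height(80)
print(f"80 km/h impact ~ fall of {h:.1f} m (~{h / STOREY_M:.0f} storeys at 3 m each)")
```

The exact storey count depends on the assumed floor height and on what is ignored (air resistance, how the impact is absorbed), so figures like "12 storeys" vary between sources; the point is that the impact energies are of the same order.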

Sometimes our perceptions and risk assessments look on the bright side of things…

Let’s talk about Optimism Bias

Optimism bias (also known as unrealistic optimism) is a cognitive bias that causes a person to believe that they are at a lesser risk of experiencing a negative event compared to others.

I first heard this term at an Ambulance Service NSW training session on risk assessment, and I have not heard anyone use it since. Yet the concept rings true, and it gives a label to an idea that some trainers and assessors try to raise awareness of in the training and risk assessments they run.

Research says that optimism bias transcends characteristics such as gender, race and age. Each of us has this bias, influenced by the following factors:

  1. Our desired end state (goals)
  2. Our style of thinking and decision making
  3. The beliefs we have about ourselves versus others
  4. Our overall mood

While this bias can shape our thinking in positive ways, research shows that it is more likely to have negative effects – leading us to engage in riskier activities, or to skip the precautionary measures our safety requires.

What might this look like in the Emergency Services?

Do you (or someone you know) display any of the following?

  • Responders who tend to focus on finding information that supports what they want to happen, rather than what is actually likely to happen to them
  • Team members who are overly confident about their skill set… until they find out they are going to be tested on it (then they become quite modest)
  • Officers who believe they will not be harmed in a car accident if they are driving the vehicle, because they are in control
  • Officers who think in stereotypes rather than about the actual comparison group, e.g. “bad drivers cause crashes” (discounting all the average drivers who are also involved in crashes), or “low socioeconomic demographics have higher levels of heart disease and heart attacks” (discounting all the people from high socioeconomic backgrounds who also have heart disease)
  • Team members who base risk assessments on their own specific experiences and feelings, while ignoring what may be experienced by others or the average person
  • Responders who think the risk is lower, or “it won’t happen to me”, until a familiar person such as a friend or family member is involved in an incident or has that problem

These are all examples of optimism bias.

What might the key consequences be of not considering our optimism bias when we work?

  • Less effective risk assessment of situations we are putting ourselves and team mates into
  • Not taking enough preventative measures to ensure the safety of ourselves and others
  • Getting ourselves into tricky situations or causing further damage when we overestimate our ability
  • Missing possible causes of situations (or health conditions) from having a selective view of things

What can we do to reduce optimism bias in our work?

Studies have shown that it is very difficult to eliminate optimism bias; however, raising awareness of and reducing this bias can encourage people to adopt more risk-aware behaviours. More specifically, we can try to:

  • Keep it close to home: encourage comparison and thinking of ourselves and those close to us. This heightens our emotions and our level of concern about risks
  • Increase experience and exposure: actually experiencing an event leads to a decrease in optimism bias. This is hard to do in many Emergency Service situations – so think about the importance of simulations, scenario-based training and purposeful sharing of experiences to help build memory slides in individuals. At the very least, awareness of the previously unknown will reduce the optimism of “it will never happen to me”
  • Make sure you consider the big picture: stand back and take a situation in – don’t focus too early on specific theories, and don’t eliminate other possibilities until you have more data and observations to give you a good picture of what else might be happening
  • Expect the unexpected: just because you haven’t seen it happen before doesn’t mean that it can’t or won’t happen. Look up “black swan” theory if you are interested in this idea.
