Why It’s Hard To Learn From Near Misses
The COVID-19 pandemic is a clear example.
Near misses happen all the time, yet they seem remarkably hard to learn from, both at the personal and the organisational level. Part of the explanation may lie in how the human brain works.
The COVID-19 crisis offers clear examples of countries, such as the United States and the United Kingdom, where the experiences of near misses were ignored. They saw what happened in South Korea, for example, and still decided to act the way they did.
What might partially explain this is the outcome of the near miss South Korea experienced: quick containment and hardly any deaths. This reflects a cognitive bias called the outcome bias. A good illustration is the airline example: if two planes nearly crash into each other but luckily miss, the pilots may believe they saved the day through their piloting skills, and no major investigation is launched to sort out the systemic errors that caused the near miss.
As the outcome of a near miss is not having an accident, our bias is to assess the actions that led to it as better than if an accident had happened.
The outcome bias was first described by Jonathan Baron and John Hershey in 1988. Participants in their studies were asked to rate other people's decision-making and reasoning in uncertain situations, such as gambling, and they gave better scores when the outcomes were favourable, even though chance played a large role in those outcomes.
Another bias also affects our view of near misses: the optimism bias.
The first study to reveal the optimism bias was done by psychologist Neil Weinstein in 1980. He found that people are “unrealistically optimistic” about their future prospects.
In the study, Weinstein asked over 200 students to rate their chances of experiencing positive or negative life events, and then to rate the chances of other people in the group experiencing the same events. Not surprisingly, most of the students thought their own prospects were better than average. Similar studies have been done on how smart people believe they are: most people rate themselves as above average.
Final thoughts
It is easy to imagine how the optimism bias could affect our view of near misses, and of course our view of COVID-19 too. With the outcome bias thrown into the mix, it is even easier to understand why near misses are rarely taken seriously enough. The bottom line is that organisations should understand how people think and take it into account when analysing and acting on the near misses they have.
Falcony provides Incident Reporting, Near Miss Reporting and Prevention Tools. If you would like to learn more, contact us or try our 30-day free trial.
We are building the world's first operational involvement platform. Our mission is to make the process of finding, sharing, fixing and learning from issues and observations as easy as thinking about them and as rewarding as being remembered for them.
By doing this, we are making work more meaningful for all parties involved.
More information at falcony.io.