Overcome Cognitive Biases and Make Safer Decisions: A Book Summary of Thinking, Fast and Slow by Daniel Kahneman

Thinking, Fast and Slow is a 2011 book by psychologist Daniel Kahneman. It distinguishes between two modes of thinking: "System 1" and "System 2."

In "System 1", Kahneman discusses how people make decisions quickly and automatically, without much conscious thought. This system is useful for making quick decisions, but it can also lead to biases and errors in judgment.
 
In "System 2", Kahneman discusses how people can use slower, more deliberate thinking to overcome the limitations of System 1. He also discusses how people can become more aware of their own cognitive biases and learn to make better decisions.
 
For safety managers, Thinking, Fast and Slow can be a useful tool for understanding how people make decisions and how to avoid biases and errors that can lead to accidents. It can also help safety managers understand how to create an environment that promotes safe decision-making, and how to encourage workers to think more critically and deliberately about safety.
 
Safety managers can benefit from reading Thinking, Fast and Slow in a number of ways. Here are a few examples:
 
Understanding cognitive biases: The book discusses various cognitive biases that can lead people to make poor decisions, such as the anchoring bias, the framing effect, and the availability heuristic. By understanding these biases, safety managers can be more aware of them in their own thinking and in the thinking of their workers, and take steps to mitigate their influence. Here are a few examples of cognitive biases discussed in Thinking, Fast and Slow:
  1. Anchoring bias: This is the tendency to rely too heavily on the first piece of information you receive (the "anchor") when making a decision. For example, if you are trying to decide how much to pay for a used car, the initial asking price might anchor your thinking and influence your final offer. Safety managers can help workers avoid the anchoring bias by encouraging them to consider a wide range of information and perspectives when making decisions.

  2. Framing effect: This is the tendency to be influenced by the way in which information is presented. For example, if you are presented with two options and one is described in positive terms (e.g., "90% chance of success") and the other is described in negative terms (e.g., "10% chance of failure"), you might be more likely to choose the first option, even if the two options are actually the same. Safety managers can help workers avoid the framing effect by being transparent and objective when presenting information.

  3. Availability heuristic: This is the tendency to judge the likelihood of an event based on how easily you can recall similar events from memory. For example, if you hear about a plane crash on the news, you might think that plane crashes are more common than they actually are, because the event is fresh in your memory. Safety managers can help workers avoid the availability heuristic by providing them with accurate, up-to-date data and information.

  4. Confirmation bias: This is the tendency to pay more attention to information that confirms your existing beliefs and to ignore or downplay information that challenges those beliefs. For example, if you believe that a certain safety measure is unnecessary, you might be more likely to focus on evidence that supports your belief and to ignore evidence that suggests otherwise. Safety managers can help workers avoid confirmation bias by encouraging them to consider a diverse range of perspectives and to seek out disconfirming evidence.

  5. Representativeness heuristic: This is the tendency to judge the likelihood of an event based on how closely it resembles a typical or representative example. For example, if you are asked to estimate the probability that a person is a librarian based on the fact that they wear glasses and are quiet, you might be influenced by the stereotype of librarians as being studious and bookish, and overlook other relevant information. Safety managers can help workers avoid the representativeness heuristic by encouraging them to consider all relevant information when making decisions.

  6. Overconfidence bias: This is the tendency to overestimate your own abilities or knowledge. For example, you might be overconfident in your ability to operate a piece of machinery safely, or in your understanding of a complex process. Safety managers can help workers avoid overconfidence bias by encouraging them to be humble and to seek out training and guidance when needed.


Promoting deliberate thinking: System 2 thinking is slower and more effortful, but it is also more accurate and less prone to bias than System 1 thinking. Safety managers can encourage their workers to engage System 2 by providing the time and resources they need to make careful decisions, and by promoting a culture of critical thinking.
 
Creating a safe environment: The book discusses how the surrounding environment shapes people's decision-making, and how environments can be designed to promote safe choices. Safety managers can use this knowledge to design safer workplaces and to create policies and procedures that steer workers toward safe behaviour.
 
Improving communication: The book also discusses the importance of effective communication in decision-making, and how misunderstandings and miscommunications can lead to errors. Safety managers can use this knowledge to improve their own communication with workers and to encourage clear communication throughout the organization.
 

Conclusion

 
Overall, Thinking, Fast and Slow is a valuable resource for understanding how people make decisions and why their judgment sometimes fails. By understanding the cognitive biases discussed in the book and taking steps to mitigate their influence, safety managers can build an environment that promotes safe decision-making and help their workers think more critically, deliberately, and safely.

If you're looking for an incident reporting platform that is hyper-easy to use, ticks all the boxes for anonymity and two-way communication, and has built-in workflows for multiple use cases and more, test drive our incident reporting platform or contact us for more information!



We are building the world's first operational involvement platform. Our mission is to make the process of finding, sharing, fixing and learning from issues and observations as easy as thinking about them and as rewarding as being remembered for them.

By doing this, we are making work more meaningful for all parties involved.

More information at falcony.io.
