Overcome Cognitive Biases and Make Safer Decisions: a book summary of Thinking, Fast and Slow by Daniel Kahneman
Thinking, Fast and Slow is a book by Daniel Kahneman, published in 2011. The book divides our thinking into two modes: "System 1," which is fast, intuitive, and automatic, and "System 2," which is slow, deliberate, and effortful. Many of the cognitive biases Kahneman describes arise when System 1 makes quick judgments that System 2 fails to check. Several of these biases are especially relevant to safety decision-making:
- Anchoring bias: This is the tendency to rely too heavily on the first piece of information you receive (the "anchor") when making a decision. For example, if you are trying to decide how much to pay for a used car, the initial asking price might anchor your thinking and influence your final offer. Safety managers can help workers avoid the anchoring bias by encouraging them to consider a wide range of information and perspectives when making decisions.
- Framing effect: This is the tendency to be influenced by the way in which information is presented. For example, if you are presented with two options and one is described in positive terms (e.g., "90% chance of success") and the other is described in negative terms (e.g., "10% chance of failure"), you might be more likely to choose the first option, even if the two options are actually the same. Safety managers can help workers avoid the framing effect by being transparent and objective when presenting information.
- Availability heuristic: This is the tendency to judge the likelihood of an event based on how easily you can recall similar events from memory. For example, if you hear about a plane crash on the news, you might think that plane crashes are more common than they actually are, because the event is fresh in your memory. Safety managers can help workers avoid the availability heuristic by providing them with accurate, up-to-date data and information.
- Confirmation bias: This is the tendency to pay more attention to information that confirms your existing beliefs and to ignore or downplay information that challenges those beliefs. For example, if you believe that a certain safety measure is unnecessary, you might be more likely to focus on evidence that supports your belief and to ignore evidence that suggests otherwise. Safety managers can help workers avoid confirmation bias by encouraging them to consider a diverse range of perspectives and to seek out disconfirming evidence.
- Representativeness heuristic: This is the tendency to judge the likelihood of an event based on how closely it resembles a typical or representative example. For example, if you are asked to estimate the probability that a person is a librarian based on the fact that they wear glasses and are quiet, you might be influenced by the stereotype of librarians as being studious and bookish, and overlook other relevant information. Safety managers can help workers avoid the representativeness heuristic by encouraging them to consider all relevant information when making decisions.
- Overconfidence bias: This is the tendency to overestimate your own abilities or knowledge. For example, you might be overconfident in your ability to operate a piece of machinery safely, or in your understanding of a complex process. Safety managers can help workers avoid overconfidence bias by encouraging them to be humble and to seek out training and guidance when needed.
Conclusion
If you're looking for an incident reporting platform that is hyper-easy to use, ticks all the boxes for anonymity and two-way communication, and has built-in workflows for multiple use cases and more, test drive our incident reporting platform or contact us for more information!
We are building the world's first operational involvement platform. Our mission is to make the process of finding, sharing, fixing and learning from issues and observations as easy as thinking about them and as rewarding as being remembered for them.
By doing this, we are making work more meaningful for all parties involved.
More information at falcony.io.