“How could they have been so stupid?”
“Why didn’t they show more common sense?”
“They should have seen it coming.”
“The cause was clearly human error.”
“Why didn’t they just follow the procedures?”
“This accident was clearly preventable.”
Do these statements sound familiar? They should. I hear them all the time after an incident and have probably even used one or more of them myself. With the seeming clarity of hindsight, it is seductively easy for those not involved in an incident to self-righteously proclaim it to have been “preventable.” Companies even use the clichéd “ALL accidents are preventable” as a motto. There are several problems with this kind of reasoning, and they all stem from hindsight bias. Before we get into specifics, however, we need to understand just what hindsight bias is and our own capacity for it.
What is hindsight bias?
The term hindsight bias refers to the tendency people have to view past events as more predictable (and thus preventable) than they really were. The APA Dictionary of Psychology defines hindsight bias as “the tendency, after an event has occurred, to overestimate the extent to which the outcome could have been foreseen.” Some refer to hindsight bias as the “I knew it all along” syndrome. Like other biases, we are all susceptible to hindsight bias; it flatters us into believing we possess foresight beyond our actual capacity. Since those of us in safety deal with incidents and near misses on a routine basis, we have more opportunities to succumb to hindsight bias than the general public. This is problematic for a variety of reasons.
What makes hindsight bias a problem?
Oversimplification: With hindsight bias you have access to information that was rarely available to those doing the work at the time of the incident. This makes it easy to draw a straight cause-and-effect line to explain the incident (generally someone’s “fault”), ignoring the myriad factors (from production pressure to the complexity of procedures) that commonly interact and increase a worker’s propensity to fail.
Fear: When you tell your workers that all accidents are preventable, you are in fact communicating that it is their fault if they experience one. Even under the influence of good intentions, saying that all accidents are preventable is often perceived by workers as a warning that accidents are merely the result of careless workers failing to prevent them. Fear nullifies trust and engagement and has a toxic effect on continuous safety improvement.
Blame: Nothing generates fear like blame – especially when the blame is perceived as unjustified. Managers and supervisors often reward employees for unsafe work habits (e.g., cutting corners to get the job done on time) only to punish them later when an accident occurs. Even in the relatively effective organizations I’ve assessed over the years, most disciplinary actions were taken only after an accident. Prior to the accident, the very same “careless” behavior was frequently condoned or ignored. Hindsight bias often concludes with, “They should have done this…,” placing the blame squarely on the worker’s shoulders while ignoring contextual problems.
Missed learning opportunities: Conclusions grounded in hindsight bias are often the “obvious” answers and ignore the why of accident causation. There is a deeper and more nuanced story to virtually every incident, going considerably beyond “someone did something stupid.” This is especially true of serious incidents and fatalities. Ignoring the role of interacting factors outside the worker’s control (e.g., cultural, environmental, etc.) leads to simplistic corrective actions that fail to address root causes and typically result in needless recurrence. Without identifying the root cause, nothing has really been fixed.
Credibility: All work involves at least some degree of risk and all work involving people is subject to inevitable human fallibility and error. Your workforce will inevitably identify incidents that, in their eyes, were NOT preventable.
No one wants anyone, especially themselves, to get hurt. Personally, I hope I never witness or investigate another accident. As much as I may want to prevent incidents, I must embrace reality: ultimately, they do occur. (I can, however, hope for ever-smaller numbers.) One of the things I’ve tried to do as a safety professional is to help organizations develop protocols to mitigate unplanned incidents – byproducts of “human error,” equipment failure, acts of God, whatever. My preference, therefore, is messaging that promotes specific actions for safety engagement and improvement, as opposed to meaningless and damaging clichés such as “all accidents are preventable.”
My esteemed colleague, Carsten Busch, has written on this topic in a previous SafetyStratus blog, Are All Accidents Preventable, as well as in his excellent book, Safety Myth 101: Musings on Myths, Misunderstandings and More. Both are excellent and important reads which I highly recommend.
Mr. Loud’s over 40 years of safety experience includes 15 years with the Tennessee Valley Authority (TVA) where he served as the supervisor of Safety and Loss Control for a large commercial nuclear facility and later as manager of the corporate nuclear safety oversight body for all three of TVA’s nuclear sites. At Los Alamos National Laboratory he headed the independent assessment organization responsible for safety, health, environmental protection, and security oversight of all laboratory operations.
Mr. Loud is a regular presenter at national and international safety conferences. He is the author of numerous papers and articles. Mr. Loud is a Certified Safety Professional (CSP), and a retired Certified Hazardous Materials Manager (CHMM). He holds a BBA from the University of Memphis, an MS in Environmental Science from the University of Oklahoma and an MPH in Occupational Health and Safety from the University of Tennessee.