In Workplace Safety, How We React Matters

“Don’t shoot the messenger.” This idiom has been in our vernacular for centuries, and its meaning is simple – don’t blame or punish the bearer of bad news. While the news may stir an unwanted emotional response, lashing out at the messenger is not an effective way to remain well informed.

In terms of workplace safety and health, how we react matters (Conklin, 2019). When something is discovered, such as an at-risk finding or details from an incident, the response to that message is crucial if an organization wants to remain well informed and learn from it. Otherwise, an appearance-based safety model is created in which everything looks great until something terrible happens (Quilley, 2010). With no prior indication that something bad would happen, the organization typically reverts to blaming the poor human who triggered the latent errors within the system.

Mike M. Ahlers, a journalist for CNN, published a very interesting article in September 2013 (Ahlers, 2013). Put simply, there was concern because the number of aircraft near misses had more than doubled from the previous year. The Federal Aviation Administration (FAA), however, had predicted an increase as it phased in a program to monitor radar and automatically report problems. In fact, the agency was pleased with the results.

Why would a federal agency want to see more near misses? First, the FAA determined that it is difficult to use accident data to identify trends, since commercial aircraft accidents are rare. Second, the FAA focused on precursors to accidents so that they could be studied and proactively addressed to prevent accidents. This was accomplished in several ways. First, better technology made near miss determinations more reliable than the human reporting that had previously been relied upon. Second, and most importantly in my opinion, the agency worked with its workforce and the air traffic controllers’ union to change the safety culture of the agency, emphasizing collection of data over punishment. Under the new non-punitive reporting system, controllers were encouraged to voluntarily report mistakes and problems. Previously, if a near miss was reported, punishment was the norm; fines or loss of licenses were common. It is no wonder voluntary reporting was frowned upon. It was noted that the number of near misses had not actually increased – the change in how reporting was done simply made the true scale of the issue visible. The moral of the story boils down to this – how we react matters.

Another example comes from the construction industry. Similar to the FAA story above, a construction organization saw the wisdom in bolstering precursor data to gain visibility into the effectiveness of its workplace safety controls in the field. Through a robust workplace safety inspection and observation program, staff could report hazards and rate them according to their risk potential. For example, a minor administrative issue on a Job Hazard Analysis (JHA) form or a worker not wearing safety glasses in a hallway would be a low risk potential, while a fall-from-height hazard such as an unprotected hole opening would be deemed a high risk potential. Trending by project, contractor, and hazard category gave transparency and visibility into recurring risk patterns that could be addressed proactively.

The program was well received, and field staff were submitting their findings regularly. In the beginning, the high-risk findings were shared quickly with the project team so that prompt action could be considered and communicated. The program was going so well that it was decided the alerts should be shared beyond the project team. Soon senior staff within the company were added to the distribution of the findings, including those at the highest levels of the organization. However, the response was not favorable. Instead of raising awareness or promoting the process, senior staff reacted against it. For example, a superintendent submitted a high-risk potential finding, and within minutes the poor bearer of bad news was inundated with emails and phone calls. “What’s going on there?!” “How could this happen!?”

As this type of response persisted, word got out that it was unwise to submit similar findings. A lesson was certainly learned – senior staff did not want to hear about the risk. Going forward, observations were either left undocumented or the risk was downplayed to avoid the unpleasantness of these confrontations. Interestingly, the number of high-risk findings decreased by 90% on average. The risk, however, did not actually diminish. The decrease was attributed to the lack of reporting, not to any systematic program addressing the risks that were being found. Again, the moral of the story boils down to this – how we react matters.
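To make the mechanics of such a program concrete, here is a minimal sketch in Python of how observations might be recorded, rated, and trended by project, contractor, and hazard category. The project and contractor names, field names, and risk labels are entirely hypothetical illustrations of the idea, not the actual system used by either organization described here.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical risk-potential labels; a real program may use a different scale.
LOW, HIGH = "low", "high"

@dataclass
class Observation:
    project: str
    contractor: str
    hazard_category: str
    risk_potential: str  # e.g. LOW or HIGH
    description: str

def trend_high_risk(observations):
    """Count high-risk-potential findings by project, contractor, and hazard category."""
    by_project = Counter()
    by_contractor = Counter()
    by_category = Counter()
    for obs in observations:
        if obs.risk_potential == HIGH:
            by_project[obs.project] += 1
            by_contractor[obs.contractor] += 1
            by_category[obs.hazard_category] += 1
    return by_project, by_contractor, by_category

# Example usage with made-up data
observations = [
    Observation("Tower A", "Acme Steel", "fall from height", HIGH,
                "Unprotected hole opening on level 4"),
    Observation("Tower A", "Acme Steel", "PPE", LOW,
                "Safety glasses not worn in hallway"),
    Observation("Plant B", "Delta Electric", "fall from height", HIGH,
                "Missing guardrail at leading edge"),
]

by_project, by_contractor, by_category = trend_high_risk(observations)
print(by_project)   # Counter({'Tower A': 1, 'Plant B': 1})
print(by_category)  # Counter({'fall from height': 2})
```

The value of this kind of trending is only as good as the reporting behind it, which is exactly why the reactions described in these two stories matter so much.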

Comparatively, another construction organization had a workplace safety observation program similar to the one described in the previous story. Staff could report hazards and rate them according to their risk potential, and high or significant risk findings would trigger an alert to the appropriate stakeholders so that proactive steps could be taken to mitigate the risk. The significant difference was in how these findings were promoted. Each week, the CEO of the organization would receive a summary of the high and significant potential risk findings from the field. The CEO would then call each staff member who submitted a significant finding and thank them for saving a life. A thank you. Not an admonishment. Not an accusation. Not a threat to ‘shoot the messenger’. A thank you. A verbal show of support for the person and the process. A positive affirmation to continue reporting, even if the news is not ideal.

How we react to news from the workforce determines whether we will continue to receive it. ‘Shooting the messenger’ will certainly not remove the bad news or mitigate the risk. The only outcome of ‘shooting the messenger’ is to remove our ability to be informed. It removes the ability to learn and to proactively respond to precursors before an unwanted event. Lastly, and most importantly, it erodes every layer of trust between messengers and those receiving the message.

AUTHOR BIO

Cary

Cary comes to the SafetyStratus team as the Vice President of Operations with almost 30 years of experience in several different industries. He began his career in the United States Navy’s nuclear power program. From there he moved into the public sector as an Environmental, Health & Safety Manager in the utility industry. After almost thirteen years, he transitioned into the construction sector as a Safety Director at a large, international construction company. Most recently he held the position of Manager of Professional Services at a safety software company, overseeing the customer success, implementation, and process consulting aspects of the services team.

At SafetyStratus, he is focused on helping achieve the company’s vision of “Saving lives and the environment by successfully integrating knowledgeable people, sustainable processes, and unparalleled technology”.

Follow @cary: LinkedIn | Twitter

Bibliography

Ahlers, M. M. (2013, September 12). US aircraft near misses more than double. CNN. Retrieved May 20, 2020, from https://www.cnn.com/travel/article/aircraft-near-misses/index.html

Conklin, T. (2019). The 5 principles of human performance: A contemporary update of the building blocks of human performance for the new view of safety. Santa Fe, NM: Pre-Accident Investigation Media.

Quilley, A. D. (2010). Creating & maintaining a practical based safety culture. Sherwood Park, AB: Safety Results Ltd.
