Demystifying Risk with Carsten Busch

Episode Transcript

Stephenie Langston: 

Hi everyone, thank you for joining us today. Hi Carsten, thank you for joining me today. Before we jump into our conversation on demystifying risk, could you tell us a little bit about yourself, your career in safety, and maybe a little bit of insight into why you’re referred to [as] the “Indiana Jones of Safety?”

Carsten Busch:

Yes, thanks for inviting me, Stephenie. It’s a pleasure to be here. A bit about myself…Currently, I’m working as a senior advisor of occupational safety for the Norwegian Police Force. My background isn’t police at all. I used to say everything I know about police is from television, so it’s probably not really reliable knowledge at all. By training I’m a mechanical engineer, and when I was at the technical college I hated it within three months, but luckily, I came into a practical year where I worked in safety and quality, and that’s how I got into safety 29 years ago (almost). So, that’s quite a while. Most of my working life I’ve been in various railway organizations [in the] Netherlands and in Norway, and I had a brief stint in offshore, which was interesting, but not really my sector. Besides working as a safety professional in various organizations, I’m very interested in the development of knowledge in our industry, in our line of work. I’m also pretty interested in professionalism and in raising that. I’m a member of the Dutch Society of Safety Science, which I’ve been in for almost all those 29 years. At some point I had read a lot, and I wrote a bit before I came out with some books, and I thought, “Let me professionalize my knowledge a bit.” So, I went studying and drifted off (accidentally, perhaps) into the more historical part of safety, which interests me because I’m interested in the development of knowledge. So, I did some archaeology, and my thesis for university was about Herbert Heinrich, the safety pioneer from the 1920s and 30s. I did some real archaeology on his work, and also on other ancient safety literature. At one point a peer of mine (who, I regret, died two years ago) nicknamed me the “Indiana Jones” because I did all the digging for the “safety gems” from long ago. So, that’s how I got the name, and I kept it because I was honored, but I don’t run around with a whip or anything.

Stephenie Langston:

Maybe a hat. We’ll have to get you a hat.

Carsten Busch: Sometimes I need a hat, as you can see here. I need protection.

Stephenie Langston:

Well, thank you very much for that introduction. Part of this series is helping professionals develop themselves. We’re trying to provide experts [with] valuable pieces of knowledge when it comes to safety, even borrowing from areas like process improvement, change management, and general business practices, to help make the safety profession a little bit more resourceful. We are providing another resource for the safety profession. You can’t have too many resources. So, if we can add another voice, that’s why I’m excited to have you today, because I do think that your books, as well as your knowledge of the safety industry, are invaluable. So, in this particular conversation I would like to look at demystifying things like risk so that safety professionals can be a little less hesitant when approaching problem solving. So, as an author you’ve published several books on topics like measuring safety, safety myths, and (most recently) preventing industrial accidents, which looks at the works of Heinrich. What prompted you to begin your writing career?

Carsten Busch:

Well, as the saying goes, “this moves in little steps,” and I don’t really know what prompted me, but I had been writing for…I think I started a decade ago (or something) with little blogs and then articles, and, as I mentioned, I’m in this Dutch Society of Safety Science and we have this quarterly magazine, which I first joined as a reviewer, writing reviews of new safety literature. Also, [I wanted] to get some of my peers reading, because there’s so much good stuff out there and I didn’t have the impression that many people really worked on development. They went to a course or they went to college [for a] degree, and then they thought they were finished for the rest of their life, and it doesn’t work this way. The world is changing, and the profession is changing, and the world around the profession is changing. So, I think we should be on the lookout for new stuff all the time, and also old stuff, to see what we can learn from that. So, I started writing there and I provided some articles, and I started writing some blogs. At some point [a peer of mine], who also passed away (a really great guy, so we miss him), came up with the idea of “What if you and me and Cary (your colleague at SafetyStratus) do a book on safety myths?” because we had been discussing these myths of “all accidents are preventable” and stuff like that. We had been discussing them and he proposed, “Shall we do a book together and just be discussing from three sides?” So, we started working on it. I was probably working on it the hardest, because we didn’t really get anywhere except for some discussions, and at some point, I thought, “Well, I really feel like writing a book now.” So, I asked the other two, “Are you comfortable if I just go ahead and do the book on my own?” And they said, “Yeah, please do.” So, that’s how Safety Myth 101 was born, and after that it’s hard to stop, I guess.

Stephenie Langston:

Well, the ideas just started flowing, and I know you covered a lot of material in Safety Myth 101, things that I, in my career, have heard time and time again. I’ve only been in the safety industry eight or so years (and limited to academic safety), so it was eye-opening and refreshing to see things addressed that didn’t necessarily make sense to me, like “all accidents are preventable”—those kinds of conversations, even just the buzzwords that people use. I saw a lot of the one that I think is helpful but can also be frustrating at times: “you need to have a great safety culture.” How do we define that? What does that actually mean? What are we putting behind that practice to get to where we need to be? Because in business culture, safety is just one part of it. I could probably go on a tangent for a while about that one.

Carsten Busch: I’d be happy to join you on that.

Stephenie Langston:

Yeah, and so it’s great to have what somebody would call a great safety culture, but I think the motivation to get there sometimes is lacking, just like when somebody in business says, “We want to have a great business culture.” Okay. Well, what are you prepared to do to actually make that happen? It requires work, effort, change in thoughts, and change in how you approach things, which I appreciated in your book.

Carsten Busch: Thanks.

Stephenie Langston:

Before I get too far off topic, today we’re going to focus mostly on demystifying risk and maybe touch, like I said, on a myth or two here, which I just did. When we first spoke about this conversation, I wanted to gain your insight as to why identifying and understanding risk is so difficult for safety professionals, and really understand what you see as the biggest obstacle for professionals (even seasoned ones) when evaluating risk.

Carsten Busch:

I think one obstacle for professionals is the one that I jotted down when I prepared myself for this conversation: we are often trapped in patterns. Because you’ve learned…One thing I do recall, when I started the safety bachelor’s degree in the Netherlands (while I was working), is that what we were taught were a few methods, and that’s it. And then, you go to work and you deliver to your customers, to your management, who need something (a risk evaluation for a new workplace, or just the overall risk, and so on, or they want to do a change), and you grab the tools that you’ve been taught. And at some point, you will find out that these tools don’t really work in the situations where we try to apply them. I think safety professionals often do get a decent basis, but we need to develop and find other ways in addition, and maybe play a bit with the methods. Just as an example, in the past few years I’ve advised on a couple of rather big reorganizations, and we said to the staff, “We need to look at risk with these reorganizations, because it affects the health, sometimes safety, and the welfare of the people.” There’s a lot of uncertainty in “What’s happening to us?” and job security, etc., and that affects people. Also, we are reorganizing while we try to run business as usual, because that’s what society expects. We can’t shut down the police and say, “We’re reorganizing. We’ll be back in three months’ time, when we’re finished.” So, there’s also a lot of workload. And then, people thought, “They’re saying sensible things. Let’s do risk assessments.” And then [the risk assessments] started, and [with] severity and frequency or probability, it just doesn’t work. What’s going to be the consequence of fusing two departments and moving them to another place? It can be anything, and “what’s the probability of [fill in the blank]?” just doesn’t give meaning. And then, we tried other ways of approaching the risk, or rather the problems (that’s part of demystifying also). Some people will think, “Well, risk is severity times probability.” So, we said, “Why don’t we talk about problems instead?” Because that’s just another way of looking at risk, and besides, there are also positive sides. Because we reorganized for a purpose, we want to achieve the positive side of risk as well, and let’s call that “opportunities.” And then, discuss problems and opportunities, and discuss it with the people: what bothers them? So, you are going much more to the “soft side” of risk, which was much more useful and appropriate in this setting. So, that’s just one thing where we found out that being trapped in what we usually do isn’t working, so let’s find another way of achieving the same goal.
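[Editor’s note: to make the “severity times probability” pattern concrete, here is a minimal sketch, in Python, of the classic risk-matrix scoring that Carsten says breaks down for reorganizations. The 1–5 scales, the color thresholds, and every name below are hypothetical illustrations, not anything from the interview or from any particular standard.]

```python
# A minimal sketch of classic risk-matrix scoring (hypothetical 1-5 scales).

SEVERITY = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}
PROBABILITY = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}

def risk_score(severity: str, probability: str) -> int:
    """The textbook formula: risk = severity x probability."""
    return SEVERITY[severity] * PROBABILITY[probability]

def risk_band(score: int) -> str:
    """Map a score onto the usual traffic-light colors (thresholds invented)."""
    if score <= 4:
        return "green (acceptable)"
    if score <= 12:
        return "yellow (reduce if practicable)"
    return "red (unacceptable)"

# Works for a well-understood hazard:
print(risk_band(risk_score("major", "unlikely")))  # -> yellow (reduce if practicable)

# But for "fuse two departments and move them", neither input is meaningful:
# the consequence "can be anything", so the formula only gives false precision.
```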

Stephenie Langston:

I wonder, too…In a chemistry lab, it makes more sense to do the risk and probability, because you have a controlled experiment. You still have a human factor, but there’s less volatility, I think, than in human emotion. And so, I think learning as safety professionals…because I wholeheartedly agree that once you learn, there are these tried-and-true things that you just follow, and then to get people to step out of that box and look at things from a different angle is often difficult. I agree that when you’re approaching a risk assessment, [one thing to consider is:] are you evaluating emotions and things that are going to be extremely volatile, or are you going to do something that’s more of a controlled environment? So, it’s really interesting to hear how looking at the problems and opportunities in this instance (which makes complete sense) really gave you a little bit more insight into how reorganization would affect people in general (as a whole), and either make the reorganization successful or a failure. Because, ultimately, if you do have safety issues and risk issues, then it’s going to fail, and/or just make people’s lives really difficult.

Carsten Busch:

Another factor in this case was that people were not only stuck in the patterns, but in a way the documentation, the safety management system, wasn’t helpful either, because it described the traditional method with risk matrices and so on. And people would say, “We need to do a risk assessment with these tools, because that’s how we do it; that’s how we should do it, because we have to document.” So, we said, “We are the bosses here.” So, we just changed the forms, and we made the forms really open and discussion-based, with just four simple columns:

  1. What are you going to do?
  2. What are the effects, positive or negative ones?
  3. What does this mean?
  4. What can we do to fix the problems?

I think that worked really nicely. It needs a bit of facilitating.
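[Editor’s note: for readers who want to try this, here is a minimal sketch of that four-column form as a simple Python record. The field names are our paraphrase of the four questions above, and the sample row is invented; nothing beyond the four columns comes from the interview.]

```python
from dataclasses import dataclass

@dataclass
class ChangeDiscussionRow:
    """One row of the open, discussion-based form described above."""
    action: str          # 1. What are you going to do?
    effects: list[str]   # 2. What are the effects, positive or negative ones?
    meaning: str         # 3. What does this mean?
    fixes: list[str]     # 4. What can we do to fix the problems?

# An invented example row, the kind a facilitated discussion might produce:
row = ChangeDiscussionRow(
    action="Merge two departments and relocate them to another building",
    effects=[
        "Uncertainty about job security (negative)",
        "Easier sharing of expertise across teams (positive)",
    ],
    meaning="Morale and workload need active follow-up during the move",
    fixes=["Regular Q&A sessions with staff", "Phase the move to limit workload peaks"],
)
print(row.action, "->", "; ".join(row.fixes))
```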

Stephenie Langston:

I think quite a few people in our audience probably come from really small environmental, health, and safety departments, maybe even some that don’t have upper management support. That’s key; if you’re going to do something like a reorganization, you likely have upper management support. And people working in a police station are more aware of risk and the conversations around risk, whereas in a business that might not be a priority during a reorganization. And so, with something like that, I’d imagine that it was somewhat time consuming. So, other than going door-to-door (or office-to-office, person-to-person), what are some tools that you think would help an army-of-one trying to approach this risk?

Carsten Busch:

In our specific case, we had the good fortune that, because it was a really big reorganization, there was this project team. The project team was actually concerned about the risk, and it really helped facilitate the whole issue that we made it a management responsibility. The manager of a unit was responsible for having a process, and he/she had to report on the matter: how they were faring and whether they had action plans. That involved people, so it helped a lot. It still didn’t work one hundred percent, of course, because some people will just go through the motions. They would rather fill out the forms than actually have the discussion. But in some places, we saw some wonderful processes, where people were really involved in them and the process went as we had dreamt it. It’s those small victories that you have to celebrate.

Stephenie Langston:

Right, I was just about to say that. Risk assessments are great, but in my previous [career], I remember doing a lot of evaluation of novel compounds that people were using in animal research, and that’s a wild-card situation. You never know what you’re going to get sometimes. [We would have] people not necessarily fill out the risk-versus-probability, but ask them questions about the drug: “Tell me how it’s used. What are you anticipating as a research professional? The metabolism, and things like that (which are all answers they should have for their research).” We broke it down to those conversations and then asked them to evaluate, “How much do you think is actually going to end up in your bedding?” Then, really working as a team effort to get to the end goal of “Is that cage safe for somebody to handle or is it not? And if it’s not, what kind of PPE [do we need]?” Not just doing a risk-versus-probability, but deep-diving a little bit more, working one-on-one. I do like the team approach, personally.

Carsten Busch:

I think many people do a lot of this stuff already, and they are actually quite good at handling risk without being aware of it. Because [many people] have this idea that risk assessment means that you fill out some kind of a table, and you play with these numbers, and then you get some color, and you put them in a risk matrix, and that’s a risk assessment. But no, it’s not that. It’s just thinking, “Well, what could happen? What should I do? What’s the effect?”

Stephenie Langston:

You mentioned in your book, too, that we do risk assessments daily. As a parent, I do a risk assessment all the time: my son’s hiking and I can see him. It’s part of your natural, innate ability as a human to do a risk assessment, and I think sometimes we overthink that as safety professionals.

Carsten Busch:

We overthink it, and maybe we don’t use that ability enough. We try to actually engage them and tell people [they] are doing most of this already, and for police officers, it’s ingrained in their work. They actually have a hard time distinguishing [whether they are] doing police work or taking care of their own safety and security. I think that’s good.

Stephenie Langston:

So, in your book, you mentioned that there are three types of safety. Do you mind elaborating on that a little bit more, and how do you think that plays a role in identifying and evaluating risk?

Carsten Busch:

Well, how you define safety, or how you choose to represent your risk, will affect how you assess and make a judgment of your risk, I think. If you take the three views from the measuring book: [first,] you can view safety as an outcome. In risk terms, I would think that it’s more or less experienced risk, risk that you have been exposed to in a period of time. The number of fatalities on the roads, for example, is often presented as experienced risk by regulators. And then you do some judgment of whether risk has been lowered or has increased. So that’s one way of thinking about risk, and it puts a premium (more or less) on [the idea that] risk can be judged looking backwards. Which is valid to a point, because the near future is probably pretty much like the near past, but it is limiting, and (in a sense) it’s even wrong, because risk isn’t about the past; risk is about the future. Risk is about something that hasn’t happened yet, but may happen or may never happen. That’s the fun part of risk, because you don’t know what it is until it is, and then it isn’t any more, so to speak. That was very, very philosophic. So, you have the backward looking, but it’s limited. And the backward looking works best for systems which are very well understood and highly controlled (or very technical). Like light bulb manufacturers do these tests, switching on and off, and then [they know] most of these light bulbs will tolerate this period of stress, etc. Then acceptable risk is the other view on safety, and I think that’s the very appealing one: that you can calculate some kind of level where you want to be. It has some issues, too, because it’s limited by knowledge. You base your probability on something, but how good is that? How well understood are the effects? How well understood are the probabilities? How good is your knowledge base and all that? So, instead of a straight line at 10⁻⁴ (or something), maybe you should rather talk about a range: our risk is between 10⁻³ and 10⁻⁵-ish. So, there are variations, and you have to account for them. And there is, of course (probably the biggest issue): who is going to decide on what is acceptable? Is it the regulator? Is it the people who benefit from the risk? Is it the people who are on the receiving end of the risk? If somebody would come and build a chemical factory next door here, I would probably not be on the benefit side, but my risk grows. So then, they find this is acceptable risk and the Environmental Agency thinks it’s great, but still, I have a factory next door, and if it goes “boom,” my house goes “boom.”
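[Editor’s note: a minimal sketch, in Python, of Carsten’s “range, not a line” point. All numbers are invented: we assume a point estimate of 10⁻⁴ events per year, an order-of-magnitude uncertainty in the knowledge base, and a hypothetical acceptance criterion.]

```python
# If the knowledge base is shaky, report an interval, not a single line.

point_estimate = 1e-4        # events per year: the "straight line" answer
uncertainty_factor = 10      # order-of-magnitude doubt about the inputs

low = point_estimate / uncertainty_factor    # 1e-5 per year (optimistic)
high = point_estimate * uncertainty_factor   # 1e-3 per year (pessimistic)

acceptance_criterion = 1e-4  # whoever decides acceptability picked this

print(f"Estimated risk: {low:.0e} to {high:.0e} per year")
print("Point estimate acceptable: ", point_estimate <= acceptance_criterion)
print("Pessimistic end acceptable:", high <= acceptance_criterion)
# The point estimate just passes; the pessimistic end misses by a factor
# of 10 -- which is exactly why "who decides what is acceptable?" matters.
```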

Stephenie Langston:

I can think of things like the fertilizer plant explosion in West, Texas (near Waco), here in the U.S. I think people moving back into that area probably did the same thing, where they’re like, “Is this acceptable? Do I want to live this close to where there was this major accident, and people lost their homes?” My mind just went straight to that example of the devastation: regulators obviously found it fine, the company found it acceptable, and the homeowners may have found it acceptable at the time, but going forward likely will not.

Carsten Busch:

There are a zillion cases like these, where…Who decides what’s acceptable? It’s a very appealing idea, I think, but it has a lot of limitations too. Which, I think, we as technical safety people probably don’t see, because we are very much in this mindset of, “We can calculate, and of course, there are uncertainties, and we can calculate those too.” And then we can say, on a theoretical basis, “This should be great.” And then, we are baffled by the fact that others don’t agree with us. It seems like they’re talking another language, and we do talk another language. That’s an interesting topic I’ve been reading up on a bit here. The third view is seeing safety or risk as the following of rules, which I think is a very basic way of looking at risk and safety, and it’s also a very comfortable way, I think. You more or less benchmark something with, “Well, this is how it should be done, and if I do it this way, then I control the risk.” It’s what we do at traffic lights. We stop at red, we drive at green, and most of the time that’s how we should behave, and then things go right. It’s how we teach our children in safety, because children aren’t able to risk assess yet. At least, I don’t think they understand the concepts yet as little children of three, four years old. So, we teach them the rules. You come to the side of the road, and there’s this light: you stop at red. My children were really good at it. We went shopping, and when people would cross at red, they would [say], “Ah! You should stop!” Stuff like that, and as a parent, you will think, “Oh, my dear. What’s happening here, on the other side? Well, at least I taught them this.”

Stephenie Langston:

My son’s favorite saying is…he will go to the playground, and if somebody climbs too high, he goes, “That’s not safe!”

Carsten Busch:

If you want to be right, play the safe card. People can’t say that they’re “not for safety.” That’s one way of looking at risk, but it’s limiting, because rules don’t work in all situations. We’ve probably all been there: we came to a red light in the middle of the night and there were no cars in sight, and inside [we think], “Should I drive, or should I wait and sit here?” Stuff like that.

Stephenie Langston:

Yeah, the ones where you’re sitting there and it’s that red arrow to turn left, and you’re like, “There’s nobody here!” and it’s the longest light. I think the regulatory side of things is a good basis. You’re right, it’s very basic. I’ve had similar conversations with other safety professionals [about what happens] if we just go by the rules. I’ve known safety professionals who do that; they’re like, “This is the rule. We’re going to follow it. This is the only way forward.” It’s almost like you put blinders on and you can’t see the chaos that’s also happening all around you, and I think you miss a lot that way. I think you can start to incorporate the good things…So, maybe start with the regulatory side of things and then incorporate acceptability, what’s acceptable past the regulatory side. [Then], going forward, realize that there are those gray zones that you have to approach with a different mind’s eye.

Carsten Busch:

I think people should (to a larger degree) look at where rules really fit and where they are counterproductive, and I think last year is a fantastic case for all this kind of stuff. I think my parents (who live down in the Netherlands) weren’t allowed to have my sister with her children over for a visit, because then they would receive five visitors, but my parents could drive to their place and visit them, because that family would only receive two visitors. What’s the difference? Who is receiving and who is being the guest? Rules are great, because they simplify stuff. We don’t have to [think], “What should we do? What is acceptable in the situation? Are we sure?” We just get a rule from whoever makes the rules, and then we follow them, and we think it’s good.

Stephenie Langston:

Maybe this is just ingrained in human DNA, and I know I’m guilty of this as well: I think sometimes we forget to question the rules, like your evaluation of your parents going to your sister’s, and things like that. Sometimes, I think we forget to evaluate the rules and ask questions like, “Why is this in place?” Maybe not on the grand scale of a pandemic, but even just within your own organization, setting policies. You can set a policy, but is it serving a good purpose? It does take some time going back and saying, “We set this policy; let’s go back and reevaluate it.” Just like you go back and reevaluate risk. Are your policies actually serving a purpose, or (and I’m just speaking to the area that I worked in) are they hindering discovery, are they hindering creativity? Rules are great, but, at least for me, the reason I had a job in academia was because you had these brilliant individuals who were creative and searching for discovery (almost wanting to be Indiana Jones-like, as well). If you create so many rules and regulations that hinder that, they’re out of a job, but you’re out of a job as well. So, what are some things that you can go back and reevaluate, and maybe leave some room to grow into?

Carsten Busch:

I actually have a nice example here of rules or guidelines that have negative side effects. This bottle of water here: the date on it says the 31st of August, 2020. So, the “best before” date is almost a year ago, but it tastes perfectly fine. Here in Norway (I don’t know how it is in the United States), more and more there is this greater awareness that these dates are useful, but you can also smell and taste, and if it’s good, it’s good. So, it’s not the date that necessarily says something about consumption. The date is about the selling date, but I drink this water and it tastes perfectly fine.

Stephenie Langston:

It’s almost as if the “sell by” date or the “best by” date is what the company would assume as acceptable risk. So, they’ll assume acceptable risk for that water being good quality up until that date, and then after that it’s up to you (as the consumer) to decide, “Is it acceptable or is it not?”

Carsten Busch:

Who decides on the risk? I think in the companies it’s very much the lawyers who say, “We don’t want any cases, so we’ll put a date that we can guarantee.”

For more safety advice and information from Carsten visit: www.mindtherisk.com

Or purchase one of his books.
