Kevin Hallas, Commercial Manager in HSE’s Centre for Human and Organisational Performance, illustrates how instinctual, rather than thoughtful, behaviour can undermine health and safety.
How long do you wait for temporary traffic lights to change? The longer you wait, the more you think they’re not working.
You can see the other end of the road works and there are no cars approaching. The light stays red. It brings to mind the Tragedy of Hope1 by Tim Key, an amusing poem about waiting for a friend, who shows up a year late to find just a skeleton at the rendezvous point.
The dilemma is worse when you’re cycling. You’re getting cold, it takes longer to get through on a bike, and what if a car comes the other way? The light stays red.
Say you ride through, get struck by a car and end up in casualty. The accident investigation concludes that you rode past the traffic light at red. Clearly you are at fault: what sort of idiot rides past a red light? So you don’t ride through; you get really cold, shiver uncontrollably, collapse on the ground and end up in casualty. The accident investigation concludes that you were stationary for an hour. What an idiot! Whether you go through or not, people will use hindsight to conclude that you were crazy.
Now consider the scenario where you ride through the red light, survive intact, and it’s as if it never happened. If you ride through and all is OK, you increase the likelihood that you’ll do the same thing next time. Familiarity with the situation tends to move our thinking process from our slow brain to our fast brain; the action becomes instinctual rather than thoughtful. The greater the positive reinforcement of no bad outcome, the more comfortable we become with passing the red light and the less we stop to think.
This behavioural change sounds like an issue that safety professionals should be concerned about. The more you get away with not following a procedure, the more familiar and comfortable you become and the less you think about what might go wrong. Your improvised procedure becomes the norm. Look at what happened when Charlie Morecraft2 took a few short cuts that he’d normalised at the Exxon refinery in New Jersey on the 8th August 1980. It blew up, spectacularly.
We know a lot about how experience, familiarity and biases affect our behaviour; our actions become instinctual reactions to the stimuli around us. We’re much better at spotting these effects in others than in ourselves, so keep in mind that most of us are quite blind to our own biases. In safety, we’re vulnerable to hindsight bias too, rationalising a bad outcome as entirely predictable given the evidence before us (after the fact). To get a proper understanding of the factors leading to an incident, you need to empathise with the people on the ground and consider what information they actually had with which to make their decisions.
So next time you’re looking at a workplace near miss or accident, consider the human factors involved. Ask yourself: were your staff set up to succeed? Were procedures usable, practical and fit for purpose? Had poor practices become instinctual? Did your staff have the information they needed to make safer decisions?
2 Out of the Ashes, Charlie Morecraft, 2012, ISBN 978-1455508471