Investigating behaviour

Accident investigations should pay more attention to human factors, especially behaviour.  "Over 90% of accidents may be attributed, at least in part, to the actions or omissions of people."1

When it comes to examples we are spoiled for choice and unfortunately many of them are quite spectacular.  They include Three Mile Island (1979), the King's Cross fire (1987), the Herald of Free Enterprise disaster (1987), the Union Carbide disaster at Bhopal (1984), the space shuttle Challenger explosion (1986), Piper Alpha (1988) and Chernobyl (1986).  More recent examples include Buncefield (2005) and the BP Deepwater Horizon Gulf oil disaster (2010).

But although most accidents result from human failure, accident investigations do not probe these behavioural causes in anything like sufficient depth. This is a serious omission. It ignores vital evidence that could help prevent recurrence.

Of course investigations will look at those human failings leading directly to the accident, but all too often that's it.  Content to put the cause down to human error and mete out a suitably proportionate punishment, they will look no further. But in the words of the HSE publication HSG 245 (Investigating accidents and incidents)2: "Investigations that conclude that operator error was the sole cause are rarely acceptable. Underpinning the 'human error' there will be a number of underlying causes that created the environment in which human errors were inevitable. For example inadequate training and supervision, poor equipment design, lack of management commitment, poor attitude to health and safety."

So why are these underlying behavioural causes so often neglected?  Identifying them could be invaluable.

"Behavioural aspects are often difficult to understand and investigate and the lead investigator simply may not have enough knowledge of human factors," says Malcolm Cope, an ergonomics and human factors senior scientist with the Health and Safety Laboratory (HSL).  "They are more likely to be somebody with an appreciation of the job, eg. an engineer working on the process, or a safety adviser employed by the organisation.  The temptation is to look at the accident in black and white terms and to focus on the immediate cause."

Nor does it help that behavioural factors are not always easy to measure. 

"Very often psychological factors underlie behaviours and these are difficult to measure in a validated way," says Mike Gray, a principal specialist inspector with the HSE.  Mike speaks from experience, having used his background in ergonomics and human factors to investigate many accidents.

Human failures

Human failures are grouped according to the immediacy of their consequences.  Active failures have immediate consequences and usually involve frontline workers. Examples would include an electrician getting a shock after failing to isolate high voltage equipment or an operator in a control room pressing the wrong buttons and causing a chemical escape. 

The second group of failures are termed latent failures.  These failures are made by those further back from the frontline.  Examples include designers failing to design safe equipment or job procedures, and managers failing to provide adequate training or enforce safety standards. 

Human failures are also classified by type.  The two main types are errors and violations, and these are further sub-divided according to the details of their nature (Table 1).

Errors are unintentional whereas violations are deliberate.  An example of an error would be misreading a display and, as a result, making a wrong decision which leads to an accident.

Individual errors

Individual human errors should never be looked at in isolation.  Accident investigators must take the context into account.  HSE's Mike Gray gives an example.  "An operator standing at ground level was using a hand held control to operate an overhead gantry crane.  He pressed a button and the crane moved towards him.  In a panic he attempted to send it the other way.  It continued to move towards him. He was crushed by the metal load and hospitalised."

At first glance this may seem a simple case of carelessness. The operator pressed the wrong button with disastrous consequences. This type of human error is termed a 'slip', i.e. a failure in carrying out the actions of a task.

Further investigation, however, revealed that the 'slip' was not just an example of carelessness. 

"It was a case of poor job design and poor equipment," explains Mike.  "The design, or lack of design, of the job made it all too easy for the operator to become confused. There were several cranes and not all of them reacted in the same way to the buttons on the hand held control.  For some of the cranes the left/right buttons worked the opposite way.  Where you stood relative to the crane could also made a difference, the button sometimes seeming to send the crane in an unexpected direction.  It might have helped had there been signs on the cranes clarifying how they responded to the controls." 

Whilst errors are genuine mistakes, violations are deliberate.  An example would be a worker deliberately operating a machine without a guard, or choosing not to wear personal protective equipment in order to save time.

One case study3 describes how a worker lost his hand when he became entangled in packing machinery.  Eight machines were enclosed behind a fence to protect workers from coming into contact with them.  On the day of the accident the worker had entered the enclosure to clear a blockage while all the machines were still running. Normally opening the door to get into the enclosure would have switched off the machines but the worker had deliberately overridden the door's interlock.

On the face of it the worker had defiantly disobeyed the rules.  However, it was not as simple as that.  "Very few people are wilfully reckless," says Phoebe Smith, HSL's principal human factors specialist. "There are usually other reasons for it.  Often it is to get the job done quicker.  Supervision turning a blind eye simply rewards the behaviour because it saves time and money until, of course, an accident occurs."

Which is what happened here.  Opening the door to enter the enclosure would have switched off all eight machines.  Overriding the safety interlock on the other hand kept them running.  This reduced downtime and increased productivity even though it put workers at risk.  Management tacitly encouraged this behaviour by dropping in on the packaging area to discuss production targets.  Nobody condemned the practice of overriding the interlocks.

Defeating interlocks appears to be a widespread practice.  A recent report4 identified some of the main reasons for defeating interlocks on Computer Numerical Control (CNC) machines (e.g. lathes that can be programmed to work automatically).

Several factors made defeating interlocks more likely.  Firstly, there were negative attitudes on the part of the workers about the need for interlocking, coupled with the belief that they would not be injured because their experience and common sense would protect them.  These are termed predisposing factors.  Secondly, there was often poor machine design, lack of training and confusion about the legal requirements.  These are termed enabling factors.  Finally, there was a lack of visible management commitment to safety, a lack of enforcement or disciplinary action and a positive production benefit.  These are termed reinforcing factors.

The study found one of the most significant of these factors was poor machine design. Operators found guards to be impractical, hampering their ability to do the job.

Boredom and complacency are further factors to consider.  Both can lead to human error and boredom, in particular, can also lead to very high levels of frustration and violation. Ironically, attempts to make jobs safer have in some cases increased the levels of boredom.  One example is that of airline pilots.  Automatic pilots mean that long stretches of a flight can pass with the pilots not actually flying the plane.  This means that they may not be mentally ready to take control in an emergency.  The limited number of hours spent actually piloting the aircraft may also leave them rusty.  Airlines now specify that pilots have to fly the plane for a fixed number of hours.

Dockworkers at Southampton docks rotate among three jobs: driving cranes, driving container carriers, and working on the dockside and the ships.  This mitigates the complacency that could arise from doing just one job.

Mike Gray lists other factors to consider when looking for the deeper causes of accidents:

"Although morale and complacency are difficult to measure they can influence behaviour to make accidents more likely.   Another important factor is the culture within the organisation, particularly the degree of management involvement and leadership.

"Although management involvement and leadership are not directly measurable, some of their manifestations are.  For example, does higher management tour the organisation to solicit the views of the workforce on health and safety issues?  Are there policies outlining health and safety expectations and are these policies communicated and, if necessary, reinforced?  Our colleagues in HSL have developed a straightforward survey to measure safety culture and identify where the issues are."

And, of course, poor training can lead to wrong behaviours, particularly in an emergency.  Mike Gray gives one example:  "There was a fire in a care home.  When the lights flashed on the fire alarm panel staff did not know what to do.  They wasted time trying to understand what the flashing lights meant instead of evacuating the patients.  The issue here was a lack of understanding of how the fire alarm worked.  This was a training failure."

Violations are often prompted by poorly written procedures.  HSL's Malcolm Cope comments: "They can be hidden away in impenetrable manuals whereas they should be integrated into the workplace." 

An increasingly popular way of doing this is to use the concept of the "nudge".  As its name suggests, a nudge is a subtle intervention designed to steer you gently towards a particular choice.  When used in a health and safety context, it is a nudge towards the right kind of behaviour.

A very simple example would be footprints on the floor to encourage you to use the correct walkway.  Another example could be to paint the shape of a tool on the wall just below the hook on which it should hang to encourage individuals to put tools back in their correct place.  Taken together these nudges could constitute a gentle shove towards a safer working environment.

These examples highlight just some of the individual, job and organisational factors that can influence the safety behaviour of employees.

Which are the most important?

In its publication, HSG 48, HSE is in no doubt. "Organisational factors have the greatest influence on individual and group behaviour".

Accident investigators, therefore, cannot ignore them. 

"Organisational factors set the context in which everything else happens," explains Mike Gray.  "One of the most important organisational factors of course, is leadership.  Leadership sets the expectations.  It sets the culture and it is hard for an individual to stand against a prevailing culture whether good or bad. If effective leadership is not there it can be difficult for those lower down to implement other changes because they are always working against the top.  When investigating an incident I principally look for things that people did not get right.  For example, were managers deliberately ignoring things being done wrongly?"

Clearly it is difficult to overstate the influence of health and safety leadership, whether it is the manager of Arsenal football club berating one of his players for smoking, or a managing director setting an example by wearing a hard hat on site.  Leaders set the expectations.

Care has to be taken, however, that this expectation does not get lost or mistranslated as it travels down the organisation. 

"In many organisations, managers at the top can be saying all the right things, but the message somehow gets lost or changed by the time it has filtered down to the bottom," says Phoebe Smith.  "For example, it can be transformed into subtle hints suggesting a production verses safety situation.  This can lead to unspoken pressure to take shortcuts." 

Accident investigations should therefore involve workers.  Not only will their knowledge of the workplace and the job help to determine the causes of the accident; they can also provide a comparison between the safety culture at the top and the bottom of the organisation.

The management tours mentioned earlier can also ensure the safety message does stretch from the top to the bottom of the organisation.  They may also help create another powerful force for safety: peer group pressure.  Organisational culture has been described as the way we behave when no one is watching. That culture will be deemed a success when, for example, a worker on a night shift tells his colleague he should be wearing his safety spectacles.

And, of course, the degree of worker involvement is yet another indicator of an organisation's safety culture.  Had workers been involved in discussions about the overhead crane operation described earlier, job redesign may have prevented the accident.

Jane Hopkinson, HSL psychologist and co-author of the research report into why interlocks are overridden4, is currently working on HSL's Behaviour Change toolkit, ACT.  The toolkit emphasises worker involvement from the beginning. It stresses the importance of leadership, of ensuring the environment is safe and of having a good safety culture in place.  She sees worker involvement as one of the biggest weapons against cynicism. The toolkit is currently being evaluated and will be available to organisations from next March.

Handling human error

Finally, the way investigators handle an investigation is of course a human and behavioural factor in its own right, especially where human error is concerned.

Knee-jerk blame will almost certainly be counter-productive because it will undermine workers' trust and create an atmosphere of suspicion in which they will be reluctant to be open about what happened.  And once that trust has gone a vital source of knowledge will be lost to the organisation.

 

References

1. Reducing error and influencing behaviour, HSG 48 (second edition, 1999), Health and Safety Executive. Although, at 72 pages, this publication is weighty, it could be used as a comprehensive checklist for accident investigators looking to include human factors in their work. www.hse.gov.uk/pubns/books/hsg48.htm

2. Investigating accidents and incidents: a workbook for employers, unions, safety representatives and safety professionals, HSG 245 (2004, reprinted with amendments 2011), HSE. www.hse.gov.uk/pubns/books/hsg245.htm

3. Ergonomics and human factors at work: a brief guide, INDG90, HSE. In effect a cut-down version of HSG 48 for SMEs, aimed at employers and managers. www.hse.gov.uk/pubns/indg90.pdf

4. Identifying the human factors associated with the defeating of interlocks on Computer Numerical Control (CNC) machines, Research Report RR974, prepared by the Health and Safety Laboratory for HSE, 2013. www.hse.gov.uk/research/rrhtm/rr974.htm
