Good Safety but Bad Process
In this article Gary Rowe updates an earlier news article with a similar title.
Do you generally achieve good outcomes in your business even though you know your processes are far from ideal? If so, chances are you are over-relying upon your team to compensate for the gaps or deficiencies in the system.
Not surprisingly, we like it when our workers are flexible and help us achieve good results by working around our unclear systems, but we are also quick to blame workers when they make a mistake.
Error Judgement
We often hear “we’re only human”, and in the workplace we occasionally see incident reports conclude with “human error” as the cause of some adverse event, eg “he forgot to switch it off” or “she picked up the wrong box”.
In this article we show that human error is a starting point for investigations, not a conclusion or cause. Understanding this approach can rejuvenate incident investigations, and provide refreshing new insights into the underlying causes.
Error judgement is influenced by cultural, religious, psychological and other factors. Have you ever seen someone over-react to a minor issue, but appear more relaxed about other “poor” behaviour?
It is likely the person had strong personal feelings about that subject and therefore reacted strongly to that particular issue.
Similarly, some people accept many risks in their life, like smoking, over-eating and driving fast, but panic if someone mentions “artificial stone benchtop” or “asbestos”.
Cultural Bias
Some overseas cultures place much stronger expectations on individuals to “look after themselves” and to “obey rules”, eg the “spare the rod and spoil the child” philosophy, per the 17th-century poem by Samuel Butler.
In Australia, our workplace safety laws hold the managers and company primarily responsible for health and safety, and allow much wider latitude for individuals, who may occasionally forget or choose not to follow some rules.
Many of these “harsh” or “soft” attitudes to individual responsibility are not right or wrong, but are derived from culture built up over decades or even hundreds of years, and underpinned by personal beliefs from family, church, or the schools people attended. Workplace culture is rarely as strong or as effective as that created by communities.
Hindsight Bias
Another important factor which often leads us to a conclusion of human error is “hindsight bias”.
Studies have shown that when investigators know the outcome, as they inevitably do, all other possibilities are immediately extinguished, leaving a simple pathway back to an individual and their now-obvious mistake.
Blame Those Closest
Have you ever been accused of taking someone’s pen or bumping something off a table when you didn’t touch it? Chances are that you were blamed because you were the closest to the table at the time.
Similarly, in the workplace when a machine jams or a person is struck by a crane, we focus our attention on those in the immediate vicinity, and quickly conclude they made a mistake, ie human error. However, stopping at this point will not identify the root causes, or prevent the event happening again in the future.
Good Outcome – Bad Process
We tend to tolerate poor processes as long as they give good results. As a result, many failures are deeply embedded in the system of work and only become evident, if at all, after an accident.
Similarly, a procedural breach might be praised if it averts a disaster, eg departing from procedure in an emergency to deal with a previously unforeseen circumstance, but a similar action (eg procedural breach) which results in damage or harm is often treated harshly.
Dealing with Human Error
So what should we do with knowledge of individual actions which appear to have contributed to an accident? Clearly, we cannot ignore the information, but we do need a better process if we want to get to the root causes and prevent similar events in the future.
Some of the key points to consider in future investigations include:
Complexity is the enemy of safety - simplify all your procedures, instructions and signage, particularly if they are unduly long or confusing.
We all know what is extremely unsafe or very safe - minimise “grey” areas, eg by ensuring clear guidance (Go/No Go criteria) for marginal circumstances.
Demands for quick incident investigations (as opposed to prompt incident reporting, which is fine) may prevent gaining a deeper understanding – resist pressure to conclude investigations prematurely, though it is OK to provide interim reports.
Look for weaknesses in the system – strengthen the system (rather than punishing individuals) to minimise future errors.
Reserve “punishment” for those who deliberately and repeatedly break our rules, or are grossly negligent.
Note: Article adapted by Gary Rowe from the principles contained in the book Behind Human Error by Sidney Dekker et al (2010).
For a copy of our 10 page checklist on how to get behind human error email Sarah at Safety Action, or call us on 03 8544 4300 if interested in including this information in your accident investigation training.