For several decades now, the rail industry has been preoccupied with human error and its causes. Human error has assumed an exaggerated significance that is often unhealthy and counterproductive. Because the rail industry has a strong conscience, and a wealth of safety professionals with a constant drive for improvement, the limitations of the human error approach have become increasingly apparent. It frequently blinds us to what is really going on in complex safety systems.

All too often, the response to safety incidents is to identify the ‘bad apples’ and discipline them for their errors, sometimes under the guise of ‘accountability’. Yet this so-called accountability is frequently a one-sided affair: it is largely frontline staff who are held accountable, sparing more senior management further embarrassment. Few would argue with the contention that safety is a shared responsibility. Unfortunately, this often fails to translate into practical reality.

Errors are never made in a vacuum. They are usually indicative of wider patterns of safety behaviour that encompass whole safety management systems. The argument I’m making here is that responsibility for safety tends to be apportioned asymmetrically rather than being evenly distributed. This is clearly damaging for overall safety if management lessons go unlearned.

So it is high time we moved on to more fertile ground. Some human factors experts have assumed that if we can classify an error made by an individual, we will be better equipped to prevent similar errors in the future. Yet the mere classification of human error has limited value, since it tells us practically nothing about the wider context the individuals were part of at the time. For example, did the individuals involved receive the right briefing, training or operational feedback to limit the scope for making errors in the first place?

Unless we examine the whole safety system in play at the time of the error, we are likely to fall into the trap of pointing the finger of blame. We simply end up disciplining someone, or dismissing them, and usually remain none the wiser about system vulnerabilities. So we need to approach things from an altogether different angle.

For example, would a significant percentage of other reasonably competent individuals in the same adverse situation have made exactly the same error? What constitutes a significant percentage would need to be determined on a case-by-case basis. In some cases it could be lower than one per cent, because the error would put lives at risk; we would then be talking about a low-probability, high-impact event that must be prevented. If the answer to this question is ‘yes’ at the threshold set, the safety management system should pre-emptively make adjustments to prevent a reasonably competent individual from making the error.
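To make this threshold test concrete, here is a minimal sketch in Python. The function name, inputs and figures are illustrative assumptions of mine, not a CIRAS tool or an industry-standard calculation.

```python
# Illustrative sketch of the 'would other competent individuals have
# erred too?' test. All names and thresholds here are hypothetical.

def substitution_test(peer_error_rate: float, threshold: float) -> str:
    """Decide whether an error points to a systemic weakness.

    peer_error_rate: estimated fraction of reasonably competent
        individuals who would make the same error in the same adverse
        situation (e.g. from simulator trials or past incident data).
    threshold: case-by-case cut-off; for low-probability, high-impact
        events it might sit well below 0.01 (one per cent).
    """
    if peer_error_rate >= threshold:
        # Treat the error as systemic: adjust the safety management
        # system (briefing, training, infrastructure), not the person.
        return "adjust the safety management system pre-emptively"
    return "examine individual and contextual factors further"

# Example: an estimated 3% of competent peers would err, against a 1% cut-off.
print(substitution_test(0.03, 0.01))
```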

The railway industry has made great strides in its attitudes towards individuals who are unfortunate enough to make safety-critical errors in adverse conditions. Sometimes these errors have had catastrophic consequences. A relevant example is the Purley train crash of 1989, which killed five and injured 88, where the driver’s original manslaughter conviction was eventually overturned as ‘unsafe’ in 2007. The particular infrastructure configuration at this location was deemed a critical factor in the accident, having contributed to four other SPADs (signals passed at danger) in the previous five years.

Other reasonably competent drivers would have made the same error in those circumstances. With the court’s decision, the role of human error was placed in its proper context, alongside other factors of equal significance. Despite this landmark ruling, there is still plenty of work to do to move attitudes towards a more compassionate view of human error.

The human-system deficit approach

For all the reasons discussed, I am arguing for the first time here that we need a brand-new point of departure. The ‘human-system deficit’ approach I am advocating urges a complete rethinking of our attitude towards safety. The choice of words is deliberate and important: it highlights the interface between humans and the system.

Do the words we use point the finger of blame, knowingly or not, at individuals when something goes wrong? If they do, as is the case whenever we say ‘human error’, we need to steer ourselves towards a more accurate perception of what is really going on. Otherwise, we run the risk of making a judgment before being in full possession of the facts of an incident.

Naturally, it is about far more than just the language we use. The ‘human-system deficit’ approach can help restore a more moderate (and ultimately more enlightened) view of safety systems and how they are compromised under stress. The immediate assumption of human error can be suspended until there is actual evidence to suggest it is a critical factor. It may be; it may not be. But old personal prejudices about human root causes need not colour our judgment from the outset, as they often do in incident investigation. Only when we avoid this trap can we look objectively for system deficits, viewing human error as one potential factor among many rather than the sole cause of the system’s downfall.

Future safety gains in the railway industry, apart from those which rely on the introduction of new technology, will rest on re-balancing the parts played by individual and systemic factors. Thinking about safety incidents in human-system deficit terms is a step in the right direction.

Confidential reporting is an integral part of this new approach. Indeed, the whole premise of confidential reporting is that the underlying safety issue matters most, rather than the individuals involved. Our reports have never been about individuals, nor will they ever be. We’re interested in identifying vulnerabilities in company defences to help make them stronger. Many positive suggestions put forward by our reporters are then implemented, increasing the resilience of safety management systems. There can be few better examples of the human-system deficit approach in action.

Chris Langer is Human Factors Advisor at CIRAS

Visit www.ciras.org.uk