
Published: Thu 20 Feb 2020
Professor James Reason, Professor Emeritus of Psychology at Manchester University, has identified a paradox at the heart of patient safety.

The Paradox at the Heart of Patient Safety 

Professor James Reason is Professor Emeritus of Psychology at Manchester University and is widely recognised as the world's leading expert on human error. In his foreword to the excellent book Patient Safety in Emergency Medicine, Jim explains that there is a paradox at the heart of the patient safety problem. Medical education, almost uniquely, is predicated on an assumption of trained perfectibility. After a long, arduous and expensive education, doctors are expected to get it right. But they are fallible human beings like the rest of us. For many of them, error equates to incompetence or worse. Mistakes may be stigmatised or ignored rather than seen as chances for learning. The other part of the paradox is that healthcare, by its very nature, is highly error-provoking.

Understanding Human Error

Many other high-risk industries have learnt that trained perfectibility does not guarantee a safe culture. Aviation accidents by their very nature receive instant press attention, with pictures of charred hulls featuring on newspaper front covers and TV news channels within minutes of an incident. It was such images that seized aviation regulators' attention some 30 or 40 years ago. Accidents were being tagged as caused by 'human error' or 'pilot error'. The authorities finally decided that such a status quo was unsustainable, and so research into understanding human error began.

High-fidelity Simulation

Aviation is very fortunate in being able to employ high-fidelity simulation to aid research into crew behaviour in crisis situations. It is for that reason that a great deal of the development of human factors understanding has emanated from aviation. Whilst it would be unwise to attempt to transplant aviation-style human factors training wholesale into the medical or surgical setting, it does seem sensible to use some of its principles to short-circuit the arduous path that aviation has followed in this regard.

Improving the Safety of Patients

There is compelling evidence that further improvements in surgical results depend on professional leadership, technical refinement and the application of scientific evidence about human performance. Current research demonstrates that the factors known to affect the performance of surgical teams are similar to those in other high-risk industries. Many of these industries have a long history of safety investigation and improvement. Healthcare is a relatively late starter.

In 2006 the Chief Medical Officer (CMO) reported in his review Good Doctors, Safer Patients:

‘It is only relatively recently that attention has been focused on patient safety as an issue. Despite the relatively high level of risk associated with healthcare – roughly one in ten patients admitted to hospital in developed countries suffers some form of medical error – systematic attempts to improve safety and the transformations in culture, attitude, leadership and working practices necessary to drive that improvement are at an early stage’.

An Organisation with a Memory

In the same document, reference is made to the Department of Health's publication from 2000, An Organisation with a Memory, which highlighted healthcare's failure to learn systematically from things that go wrong, in marked contrast to other high-risk industries. The report demonstrated the importance of improved and unified mechanisms for detecting safety problems and the value of a more open culture. It also highlighted the merit of a systems approach to preventing, analysing and learning from adverse events.

Comparisons with Other Industries

The most frequently compared high-risk industry is, naturally, aviation, where for several decades controlled studies of expert performance have been carried out under valid conditions, in both normal and emergency operations. Unlike aviation, medicine is a natural science, making it both more capricious and more idiosyncratic than aeronautical engineering. The risks and complexity of aviation differ substantially from the unlimited conditions faced by medical teams and their patients. Importantly, the training, assessment and regulation are completely different.

Professional Leadership

However, research into individual and team performance has shown striking similarities in performance and outcome, particularly in the non-technical skills of experts. There is now an unprecedented opportunity for surgical teams to learn about the science and to show improvement. Professional leadership is vital in this important area of risk management, one that has increasingly become known as 'human factors'. This professional leadership can only be cultivated and deployed when supported by an appropriate training programme, effective regulation and efficient management systems. In the NHS these critical components are deficient, and patients can be expected to continue to suffer avoidable harm until they are improved. We now know that merely exhorting professionals to try harder does not work.

Aviation Training

In aviation, training was developed in response to new insights into the causes of aircraft accidents following the introduction of flight recorders and cockpit voice recorders into modern jet aircraft. Information from these has shown that few accidents result from technical malfunction of the aircraft or its systems, from a failure of handling skills, or from a lack of technical knowledge in the crew. In over 70% of fatal events the failures are non-technical, a situation exactly matched by the 70% incidence of communication failure in adverse medical events.

Author: Guy Hirst

Photo by Olga Guryanova on Unsplash