How do you explain the differences between the airlines and healthcare industries (4Ps article)?
Safety in aviation has often been compared with safety in healthcare. Following a recent article in this journal, the UK government set up an Independent Patient Safety Investigation Service to emulate a similar well-established body in aviation. On the basis of a detailed review of relevant publications that examine patient safety in the context of aviation practice, we have drawn up a table of comparative features and a conceptual framework for patient safety. Convergence and divergence of safety-related behaviours across aviation and healthcare were derived and documented. Key safety-related domains that emerged included Checklists, Training, Crew Resource Management, Sterile Cockpit, Investigation and Reporting of Incidents, and Organisational Culture. We conclude that whilst healthcare has much to learn from aviation in certain key domains, the transfer of lessons from aviation to healthcare needs to be nuanced, with the specific characteristics and needs of healthcare borne in mind. On the basis of this review, it is recommended that healthcare should emulate aviation in its resourcing of staff who specialise in human factors and related psychological aspects of patient safety and staff wellbeing. Professional and post-qualification staff training could specifically include Cognitive Bias Avoidance Training, as cognitive bias appears to play a key part in many errors relating to patient safety and staff wellbeing.
Comparisons have often been made between safety management in aviation and healthcare. This emulation is in the context of major achievements in the field of aviation – despite the number of worldwide flight hours more than doubling over the past 20 years (from approximately 25 million in 1993 to 54 million in 2013), the number of fatalities has fallen from approximately 450 to 250 per year. This stands in stark contrast to healthcare, where in the USA alone there are an estimated 200,000 preventable medical deaths every year, the equivalent of almost three fatal airline crashes per day. As the renowned airline pilot Chesley Sullenberger noted, if such a level of fatalities were to occur in aviation, airlines would stop flying, airports would close, there would be congressional hearings and there would be a presidential commission. No one would be allowed to fly until the problem had been solved.
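A rough back-of-envelope check of that equivalence (our own arithmetic, assuming that a fatal crash of a fully loaded mid-size airliner claims on the order of 200 lives):

$$\frac{200{,}000\ \text{deaths/year}}{365\ \text{days/year}} \approx 548\ \text{deaths/day}, \qquad \frac{548\ \text{deaths/day}}{200\ \text{deaths/crash}} \approx 2.7\ \text{crashes/day}.$$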
In this article, we present a comprehensive review of the differences between aviation and healthcare, and of how lessons learned in aviation can be applied to healthcare.
Differences between the airlines and healthcare industries
Latent factors and organisational culture
At least three safety-related cultural attributes appear to distinguish aviation from healthcare. Aviation has much more of a blame-free culture around reporting and owning up to safety incidents; in healthcare, there appear more often to be competing demands between economic factors and safety, with financial pressures and considerations constantly making news headlines; and safety permeates all levels of the business of airlines, whereas in healthcare it is still regarded as the priority of some, not the obligation of all. What is common to both industries is the concept of professionalism, but paradoxically this may sometimes lend itself to corners being cut and to social fragmentation between professional groups.
A safety culture toolkit developed in the UK after railway accidents identified the following key features – leadership, two-way communication, employee involvement, learning culture and attitude towards blame. It is widely accepted that along these dimensions the organisational culture in aviation has changed dramatically over the past 30–40 years, but in healthcare organisations such as the NHS in the UK there is still the feeling that hierarchies and fear of speaking out persist, and that the lack of accountability for those who have transgressed, together with the absence of any apology, perpetuates these cultural limitations. Sullenberger has referred to an era in aviation where pilots ‘acted like gods with a little “g” and cowboys with a capital “C”’. Sadly, some of this culture would still appear to remain in parts of healthcare. As Timmons et al. have argued, full and successful implementation of human factors initiatives may be stalled if the culture in an organisation is not accommodating. They found that a six-day human factors training course taken by emergency and perioperative staff appeared to be valued and considered helpful by staff who took part, but that implementation of behavioural changes on the ground was stalled by long-standing cultural and organisational issues. Sullenberger has powerfully argued for patient safety to be embedded in board and financial decision making in healthcare – and noted,
Safety should be part and parcel of everything we do … Every decision that is made, whether it’s administrative, budgetary, or otherwise, should take safety implications into account because there is such an important business case for doing so … What we have right now, quite frankly, in healthcare are islands – visible islands of excellence in a sea of invisible failures, with risk lurking just below the waterline. We need to widen those islands of excellence. We need to connect these islands with more dry land. We need to address these areas of risk. That is going to require transparency, it’s going to require data, it’s going to require personal story telling, and it’s going to require effective use of health IT.
Implicit in healthcare comparisons with other safety-critical industries is the message that staff wellbeing, morale and motivation are key to the safe, successful and profitable delivery of a service and of a supportive organisational culture. As Paul O’Neill, former US Treasury Secretary and CEO of the metal company Alcoa, stated, ‘I don’t think you can be habitually excellent at everything unless you begin with caring about your workers’. Staff may suffer distress and ill-health for a variety of reasons, ranging from distress following major complications of a treatment they have carried out to suicide in the context of undergoing investigations by a regulatory body. The Francis Report into whistleblowing in the NHS referred to many cases of whistleblowers and others being badly treated, and sometimes being subject to ‘kangaroo courts’ by NHS management, with no allowance for Plurality, Independence and Expertise principles to ensure fairness. Such cases may not only impinge on patient safety and staff wellbeing but may also involve significant expenditure from public funds, coupled with financial hardship to staff who have to pay their own legal costs. Legal settings, such as employment tribunals, are not interested in the implications of such cases for patient safety and staff wellbeing, and may sometimes be seen as weighted in favour of NHS employers, who have the financial resources to maximise a legal case, to take an unfavourable ruling to a higher court, and so on. In recent years, in the UK health service there have been prominent cases of NHS staff who have suffered as a result of extreme stress – including Eva Clark, the nurse at Mid-Staffordshire hospital who committed suicide after being bullied at work, and Jacintha Saldanha, who committed suicide in December 2012 after suffering the humiliation of mistakenly answering a hoax phone call – from callers pretending to be the Queen – to the ward where the Duchess of Cambridge was a patient. In both of these cases, the level of support that should have been provided to staff was apparently absent.
The Public Administration Select Committee of the UK House of Commons recommended that the government adopt the proposal set out by Macrae and Vincent for an independent Patient Safety Investigation Agency, and this recommendation has been accepted by the government. When adverse events in healthcare seriously affect staff wellbeing, morale and motivation – regardless of whether the origins are poor patient outcome, poor management, etc. – such events need to be given the same urgency as when patients suffer. In line with the message propounded by Paul O’Neill above, it is worth considering whether, in addition to an Independent Patient Safety Investigation Service, a parallel body should be put in place – an Independent Staff Investigation and Support Service – so that lessons can be learned when healthcare staff suffer in major ways in the clinical workplace, and so that staff support mechanisms can be readily put in place. The current UK Health Secretary is quoted as stating in June 2015, ‘The performance of the NHS is only as good as the support we give to the staff’.
Active factors
Checklists
The need for checklists is based on the premise that in the execution of procedures the human brain may be subject to three key cognitive limitations: we may forget to retrieve one of a number of steps in a procedure; we may retrieve a step but for one reason or another (e.g. distraction, fatigue) not remember to carry it out; or we may retrieve the step, remember to carry it out, but execute the action incorrectly. In aviation, there is usually much more in terms of procedural documentation of immediate relevance, such as in Airline Operations Manuals or Quick Reference Handbooks, and Toff has proposed making similar systems available in healthcare. In aviation, there appear to be three forms of checklists: one for simple, routine operations; one for more complex operations; and one for emergency procedures (where the checklist may be ‘do-verify later’ rather than ‘read-verify’). Checklists also vary between types of aircraft. Checklists have traditionally been a more integral part of aviation workflow, whereas in medical disciplines such as surgery they have been a more recent innovation. To this extent, they may be seen to represent a form of ‘time out’ during an established routine. Medical applications of checklists have included the fields of surgery and infection control, and there have also been attempts to reap the benefits of checklists to help avoid errors in medical diagnosis.
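The distinction between ‘read-verify’ and ‘do-verify’ execution can be made concrete with a minimal sketch. The Python fragment below is purely illustrative – the `Mode` enum and the `run_checklist`, `perform` and `confirm` names are our own hypothetical constructs, not taken from any aviation or clinical system:

```python
from enum import Enum, auto
from typing import Callable, List

class Mode(Enum):
    READ_VERIFY = auto()  # routine operations: read each item, do it, confirm it
    DO_VERIFY = auto()    # emergencies: act from memory first, verify afterwards

def run_checklist(items: List[str],
                  mode: Mode,
                  perform: Callable[[str], None],
                  confirm: Callable[[str], bool]) -> List[str]:
    """Run a checklist and return any items that fail confirmation."""
    if mode is Mode.READ_VERIFY:
        # Each step is read, executed and confirmed before moving on,
        # guarding against all three cognitive limitations: forgetting a
        # step, failing to act on it, or executing it incorrectly.
        missed = []
        for item in items:
            perform(item)
            if not confirm(item):
                missed.append(item)
        return missed
    # Mode.DO_VERIFY: the crew has already acted from memory, so the
    # checklist serves purely as an after-the-fact verification pass.
    return [item for item in items if not confirm(item)]
```

On this sketch, a routine pre-flight or pre-operative checklist would run in READ_VERIFY mode, whereas an emergency drill carried out from memory would be followed by a DO_VERIFY pass.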
Catchpole et al. used both aviation and Formula 1 pit-stop expertise to inform the use of checklists to ensure smooth handover between surgery and intensive care. Low et al. focused on the application of checklists at key transition points in surgery, ‘flow checklists’, so as to ensure that high-risk points such as departure from the operating room do not suffer from lapses in procedures being executed. Wadhera et al. showed how such an approach, if applied to key stages of cardiovascular surgery with high cognitive demands, can yield benefits. In a similar vein, Federwisch et al. combined staff shift change-over times with a form of checklist, in which incoming and outgoing nurses note items such as identification bracelets and IV catheter sites. Schelkun extended the checklist concept to implementing a form of aviation plan in surgical settings – plan the operation taking into account the patient, the injury/illness and the goals of the operation; decide on details of the operation, noting surgical approach, equipment needed, etc.; put together a surgical equipment checklist; and ensure good communication at every stage of the procedure, including debriefing afterwards to review what went well and what could have been improved. On the more cautious side, Clay-Williams and Colligan argued that there is variable evidence on the efficacy of checklists in healthcare, that checklists may not be applicable in more complex clinical settings (cf. 6), and that over-reliance on checklists may detract from other forms of safety. In a similar vein, Catchpole and Russ argued that a checklist is a ‘complex socio-technical intervention that requires attention to design, implementation and basic skills required for the task’, and that checklists may succeed and fail in healthcare for a variety of reasons.
Training
Training in aviation and training in fields such as surgery have been compared, with aviation training and competency assessment generally considered to be more rigorous and more regimented. Initial pilot training normally takes around three years, and becoming a captain will usually take around a further 10 years. Training to become a doctor usually takes around five years, with generally a further 10 years before becoming a consultant. Keeping up with the explosion of knowledge in healthcare is daunting but necessary, even for experienced consultants; this is not so much the case in aviation. Pilot training takes place in a variety of settings: on the ground, in an aircraft and, always, in a simulator. Simulation has also been extended to teamwork and debriefing. Simulators are overall less used in medical training – or they are used less systematically. Pilots have to undergo proficiency checks, usually in a simulator, every six months. Doctors in the UK now undergo re-validation every five years. Pilot training is broken down into core competency skills, and this form of behavioural analysis of skill training needs is becoming more common in healthcare. Non-technical skills, such as leadership, team working, decision making, situational awareness, managing stress and coping with fatigue, are extensively taught in pilot training, with well-established protocols for behavioural measurement of crew while in flight. It is only in recent years that behavioural marker systems that capture the non-technical skills of healthcare professionals have been developed in medicine, with some areas such as anaesthesia and surgery particularly embracing their value. When unexpected or emergency situations arise, both doctors and pilots will benefit from a commitment to life-long learning, a good understanding of disasters and how to deal with them, and an ability to think flexibly. What is more, the personality of the pilot has been considered as part of determining risk profiles during training, but as yet this has not happened in medicine. In surgery, Lewis et al. have argued that macho and ‘heroic’ personalities may persist among surgeons, in whom improvising or finding a solution over-rides seeking or heeding advice from others in a team.
Crew resource management and sterile cockpit
Crew resource management essentially refers to how members of a team interact and are aware of factors that influence performance. Seager et al. noted five features of crew resource management – cooperation, leadership, workload management, situational awareness and decision making. The ‘team’ in aviation may primarily be just the pilot and co-pilot, with a degree of hierarchy between the two, whereas the team in surgery or other medical settings may be more diverse, with more distinct roles and with a variable degree of hierarchy. Communication failures may be more likely to occur in healthcare than in aviation cockpit settings for a variety of reasons, including the wide range of staff and the distractions/interruptions that are prevalent in many clinical interactions. In healthcare, there is probably a wider range of information, with the reliability and dynamic nature of such information differing from that in aviation. In addition, the effects of introducing aviation-style teamwork training into medicine may vary according to the speciality, and may be determined in part by organisational and attitudinal factors. Although there are usually clear differences in knowledge, skills and experience between a pilot and co-pilot, safety in aviation is encouraged to take priority over deference, with simple measures such as the use of first names in interactions. This is not common practice in healthcare, which is inherently hierarchical, with resultant barriers to assertiveness. As Ornato and Peberdy argued, some healthcare settings may well benefit from the implementation of aviation procedures such as cross-checks, read-back and the ‘two-challenge rule’ (another team member is allowed to over-ride someone if that person has been challenged twice but has failed to respond appropriately; see the sketch below). Seager et al. have noted features of crew resource management which could be readily applied to healthcare settings such as the operating theatre, and these include peer monitoring, briefings, defining operating procedures and standards, recognition of fatigue as a factor in performance, regular ‘check rides’ in the form of assessment in a simulator, a blame-free reporting culture, use of checklists and application of the principle of a ‘sterile cockpit’. Briefings before and after surgery may be particularly helpful both in encouraging members of the team to stand back and appraise procedures, and in encouraging mutual respect and team bonding between the members. Good communication within crew resource management involves respect for each other’s roles, and also simple measures such as direct eye contact, introducing each other, using non-judgemental words and putting safety before self-esteem.
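To make the escalation logic of the two-challenge rule explicit, here is a minimal Python sketch; the function and parameter names (`two_challenge_rule`, `challenge`, `responded_adequately`, `over_ride`) are hypothetical placeholders rather than part of any published protocol:

```python
from typing import Callable

def two_challenge_rule(challenge: Callable[[int], None],
                       responded_adequately: Callable[[], bool],
                       over_ride: Callable[[], None]) -> None:
    """Sketch of the 'two-challenge rule' escalation logic."""
    for attempt in (1, 2):
        challenge(attempt)             # voice the safety concern
        if responded_adequately():     # colleague acknowledges and acts
            return                     # concern addressed; no escalation
    over_ride()                        # two failed challenges: intervene
```

In practice, challenging would be a scripted, spoken phrase, and over-riding might mean taking over the task or summoning senior help; the point of codifying the rule is that escalation follows the count of failed challenges, not the relative seniority of those involved.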
A ‘sterile cockpit’, which essentially refers to an environment free of unnecessary distractions, may improve patient safety if applied at key points in clinical procedures. A distraction-free environment is especially important when a critical or complex procedure is being carried out, whether it be an intricate stage of a surgical procedure in healthcare or taking off/landing in aviation. There is a high frequency of distractions and interruptions in the work of healthcare professionals, with a negative impact on patient safety. A number of studies, such as that by Federwisch et al., have successfully applied the sterile cockpit idea to medication delivery, where ‘DO NOT DISTURB’ tabards or signs are visible during medication rounds, so as to reduce the number of distractions. When emergencies arise in a cockpit or in a surgical setting, multiple alarms may be activated, and the ability to notice and respond to the key alarms, and to think flexibly, is vital for safe outcomes – analogies can readily be made here between airline and medical settings.
Performance analysis
Investigation of incidents
In the UK, an investigation report by the Air Accidents Investigation Branch can involve at least several months of work, with field investigations where appropriate, and detailed background information sought on the equipment and individuals involved. The usual structure of an Air Accidents Investigation Branch report is as follows:
There is firstly a factual summary of the key features of the incident, which includes detailed information about the aircraft and the pilot.
There follows a synopsis of the report:
An exposition of all the relevant facts of the incident, often with graphs and photographs
An analysis of the data gathered with a view to understanding what could have contributed to the incident
Conclusions and safety recommendations
Woloshynowych et al. have documented the types of investigations and analyses that are carried out for critical incidents and adverse events in healthcare, and studied 138 papers that provided relevant evidence. They cited systems such as the Australian Incident Monitoring System, the Critical Incident Technique, Significant Event Auditing, Root Cause Analysis, the Organizational Accident Causation Model and the Comparison with Standards approach. They concluded that:
There was little or no information on the training of investigators, how the data was extracted or any information on quality assurance for data collection and analysis … In most papers, there was little or no discussion of implementation of any changes as a result of the investigations.
Macrae and Vincent have pointed to major limitations in the quality of investigations, and in the monitoring of the implementation of recommendations for improvement, in healthcare compared with other industries such as aviation. They have argued for an independent investigations agency in the NHS, comparable to the Air Accidents Investigation Branch and to its parallel body in the USA, the National Transportation Safety Board – a recommendation that has been accepted by the UK government. In the USA, a specific aviation safety body, the Commercial Aviation Safety Team, was set up in 1998 to bring together stakeholders in government and industry. This team identifies top safety areas through analysis of accident and incident data; it charters joint teams of experts to develop methods to fully understand the chain of events leading to accidents; and it identifies and implements high-leverage safety interventions to reduce the fatality rate in these areas. Pronovost et al. argued for a similar body to be set up within healthcare.
Reporting of incidents
Reporting of incidents has many dimensions, which include the extent to which reporting is blame-free; the readiness to produce a report; the documentation of near-misses; the particular reports which are investigated; the format, investigation and dissemination of reports; the body that investigates and reports on serious incidents; positive or negative consequences for those who have contributed to or highlighted an adverse event; and the resulting action plans. In healthcare, Morbidity & Mortality meetings, where they happen, are often a forum where problematic cases are reported and discussed, and where deaths and serious complications ought to be reviewed to promote learning and improvements in practice. In terms of national reporting, in the UK there is the National Reporting and Learning Service, which is one of the largest reporting systems of its type in the world. A key criticism of reporting within healthcare is that the link from error to learning has often not materialised, and few mechanisms are put in place to ensure that changes have been implemented and errors are not repeated. In aviation, a major incident is often followed by its causes being simulated and incorporated into training, and by specific equipment-design, procedural or training recommendations being put in place, as happened after the 2009 Air France Flight 447 disaster.
In clinical practice, adverse events such as complications are often considered to be routine, and thus may not be reported. Apart from blame, some doctors may not report near-miss adverse events due to a sense of pride or self-esteem, or due to fear of litigation. There may also be lack of time for reporting and high workload, lack of understanding of why reporting is needed, concerns that no beneficial action will follow and, in some countries, lack of confidentiality or the absence of adequate reporting systems. As has been found in aviation, near-misses may often be as instructive as adverse events. It may be worth translating into healthcare the aviation system of immunity from disciplinary action for the reporting of adverse incidents, apart from cases of gross or wilful negligence. The system used in aviation, the Confidential Human Factors Incident Reporting Programme, has now been emulated in the field of surgery – the Confidential Reporting System in Surgery – and has been found to work well. Similar schemes, which also encourage the reporting of near-misses, have adopted user-friendly online reporting formats.
Ferroli et al. described how, with the support of aviation specialists, they designed a Patient Incident Reporting System form which was used to record near-misses in a neurosurgical setting. They analysed 14 such incidents and were able to distinguish different types of failures – human factors (the most common), technological factors, organisational factors and procedural factors. Their reporting and analysis system appeared to encourage a no-blame reporting culture. Clinicians rarely keep an audio or video record of their interactions with patients, and the introduction of such recordings is a matter of debate. However, in aviation, ‘black boxes’ – which record flight data and cockpit conversations – are carried in all commercial aircraft. The idea of documenting all safety failures, however minor, was also highlighted by Bowermaster et al., who likened their approach to that of using the ‘black box’ principle in aviation. Helmreich has described a ‘Line Operations Safety Audit’ that involved expert observers in the cockpit during normal flights. As well as potential safety threats, such as mountains and adverse weather, types of human error were documented, and these fell into several groups – violations (e.g. conscious failure to adhere to procedures), procedural errors (e.g. erroneous entry into flight computer), errors in communication (e.g. misunderstood altitude clearance), lack of proficiency (e.g. inability to program computer-based device) and poor decisions (e.g. decision to navigate through adverse weather). There is scope for emulating aviation by including direct observation of clinical staff as part of routinely evaluating quality of care.
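As an illustration of how such an audit taxonomy might be recorded in software, consider the following Python sketch; the `Observation` record and `summarise` helper are hypothetical constructs of ours, and the categories simply mirror the error groups listed above:

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum, auto
from typing import List

class ErrorType(Enum):
    """Error groups from Helmreich's Line Operations Safety Audit."""
    VIOLATION = auto()       # conscious failure to adhere to procedures
    PROCEDURAL = auto()      # e.g. erroneous entry into the flight computer
    COMMUNICATION = auto()   # e.g. misunderstood altitude clearance
    PROFICIENCY = auto()     # e.g. inability to program a computer-based device
    DECISION = auto()        # e.g. choosing to navigate through adverse weather

@dataclass
class Observation:
    """One expert-observed event, as an auditor might log it."""
    description: str
    error_type: ErrorType

def summarise(observations: List[Observation]) -> Counter:
    """Tally observed errors by type for an audit summary."""
    return Counter(obs.error_type for obs in observations)
```

A clinical analogue of such direct observation might tally, say, communication errors during handover against procedural errors during medication rounds, giving a unit a quantitative picture of where its risks cluster.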
Implications for healthcare
There are many opportunities for safety measures and concepts in high-risk industries such as aviation to be considered for adoption in healthcare, with a need for actions to be proactive and generative, rather than solely reactive to adverse events. A focus on systems rather than individuals, and an examination of ‘latent risk factors’ that may result in adverse events, are other lessons that we can learn from aviation. Naturally, adopting measures from aviation without adapting them for the unique healthcare environment would be unwise, but where this has been done in a systematic but flexible way, clear benefits have been found. Issues such as privacy and patient confidentiality are particularly important in healthcare. In the finance-driven world of healthcare, any safety improvements should ideally have a good economic argument to accompany them, but – as Lewis et al. have argued – making such a case should be relatively easy to do, especially bearing in mind the huge litigation costs of clinical negligence claims.
As happens in safety-critical industries such as aviation, human factors training and related psychological training in patient safety and staff wellbeing need to be an integral part of all NHS staff work-plans, from the board-room to the bedside, with dedicated human factors/patient safety psychologists in post. Most major airlines have well-established departments staffed by a large team of psychologists/human factors specialists, while this is the exception rather than the rule for major NHS hospitals. The psychology of patient safety and staff wellbeing should be an integral part of the professional training curricula of healthcare staff, staff selection, induction, appraisal, revalidation, merit awards and Continuing Professional Development, so as to gradually develop within the healthcare community an appreciation of the impact of human factors, psychological variables and non-technical skills on safety. Cognitive Bias Avoidance Training could form a central component of such training curricula, in view of the part that biased cognitive decision making plays in a number of adverse incidents, and the potential effectiveness of such training in reducing diagnostic errors. Key bodies, such as NHS England, the Care Quality Commission and the Department of Health, as well as regulatory bodies such as the General Medical Council, should have resident expertise in human factors and the psychology of safety, together with an ethos that embraces and rewards clinical excellence.
In a recent television interview, Captain Chesley Sullenberger, the senior crew member in the Hudson River aircraft incident, is reported to have stated:
We have purchased at great cost lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting those lessons and have to relearn them.