The scale of medical error

As surgeons, we are arguably practitioners of one of the most entitled, rewarded and rewarding occupations in the world. We are empowered to perform the completely legal act of putting a knife to work in a human body. Unfortunately … our patients are frequently caught in the ‘friendly fire’ of surgical care – healthcare providers causing unintentional harm when their only intent was to help.

The practice of healthcare in general, and surgery in particular, is a hazardous business. It was not until the 1990s, however, that the exact nature and scale of that hazard became apparent. In that decade, several high-profile cases of medical error were reported in the USA and the UK, such as those examined by the enquiry into perioperative deaths following paediatric cardiac surgery at the Bristol Royal Infirmary, which threw the adverse effects of medical intervention into stark relief.

This prompted further investigation into the possible scale of harm perpetrated by modern healthcare. The Harvard Medical Practice Study in 1991 demonstrated major adverse incidents in 3.7% of all patients treated. Studies conducted in Utah and Colorado, Denmark, New Zealand and Canada reported remarkably similar adverse event rates of between 8% and 12% among all patients admitted to acute hospitals, and this range is now generally accepted as typical of healthcare systems in developed countries. Crucially, across these studies about half of all adverse events were deemed preventable.

Not surprisingly, these publications brought a sharp focus onto patient safety, which has now, quite rightly, become a permanent part of healthcare policy. Numerous changes have since been advocated to improve patient safety, including mandating minimum nurse-to-patient ratios, reducing the working hours of trainee doctors, introducing ‘care bundles’, using safety checklists, and advancing the science of simulation and teamwork training.

Adverse events in surgery

The surgical domain delivers care that is more complex and higher risk than that of non-interventional specialities. It is therefore not surprising that, in the majority of studies of adverse events in healthcare, at least 50% of events occurred within the surgical domain, the majority of these in the operating theatre. Furthermore, at least half of these adverse events were also deemed preventable.

Just as multiple studies in high-income countries report similar figures for adverse events in hospitalised patients across all specialities, the rate of harm in surgery also appears consistent. A review of 14 studies, incorporating more than 16 000 surgical patients, reported an adverse event in 14.4% of surgical patients. This was not simply minor harm: 3.6% of these adverse events were fatal, 10% severe and 34% moderately harmful.

Gawande, a surgeon and healthcare policy leader in Boston, made one of the first attempts to clarify the source of these adverse events. His study pioneered the concept that the majority of adverse events were not due to a lack of technical expertise or surgical skill on the part of the surgeon, finding instead that ‘systems factors’ were the main contributing factor in 86% of adverse events. The most commonly cited system factors related to the people involved and how they were functioning in their environment: communication breakdown was a factor in 43% of incidents, individual cognitive factors (such as decision-making) were cited in 86%, and excessive workload, fatigue and the design or ergonomics of the environment also contributed. These findings were confirmed in the systematic review of surgical adverse events described above, which found that errors in what was described as ‘non-operative management’ affected 8.3% of the study population, versus only 2.5% attributable to technical surgical error.

As in other high-risk industries, such as commercial aviation, the majority of these adverse events are therefore not caused by failures of technical skill on the part of the individual surgeon, but rather arise within the wider healthcare team, environment and system. Lapses and errors in communication, teamworking, leadership, situation awareness or decision-making all feature highly in post-hoc analyses of surgical adverse events. This understanding of error causation has been acknowledged in most other high-risk industries for many years, but healthcare has only recently come to appreciate it.

Human factors

‘Enhancing clinical performance through an understanding of the effects of teamwork, tasks, equipment, workspace, culture and organisation on human behaviour and abilities, and application of that knowledge in clinical settings.’ (Ken Catchpole, https://chfg.org/)

Human factors encompass all the issues involved in humans working in a system (how we see, hear, think, behave and function physically, as individuals and as teams) that need to be considered to optimise performance and assure safety ( Fig. 9.1 ). The aim is to enhance performance by fitting the task to the human rather than vice versa, and to maintain wellbeing throughout careers. Human factors science extends to the design of tools, machines, systems, tasks and environments to ensure safe and effective human use. A human factors perspective therefore helps to explain why a piece of clinical equipment whose design has not taken human strengths and limitations into account, or a work environment and shift pattern that is disruptive and stressful, is more likely to lead to suboptimal human performance, error and compromised patient safety. In surgery, these human factors issues range from the design of surgical instruments and operating theatres to services and systems, as well as the working environment and working practices such as rotas, roles and team behaviours.

Figure 9.1, Human factors in a healthcare system.

Surgical adverse events are almost invariably preceded by multiple contributing factors, often combining unsafe systems and unsafe behaviours. Unsafe systems (such as poorly managed operating lists) produce unsafe behaviours (such as disruption during swab counts). Equally, unsafe behaviours (such as disrespect towards junior staff) undermine safety processes (such as use of the World Health Organisation Safer Surgery checklist). Examples of poor systems and practices in the National Health Service include: widespread toleration of variation in standard procedures, such as surgical counts; operating lists with multiple changes in list order; failure to adhere to surgical site marking procedures; inadequate staffing; and absent or inadequate training, particularly in teamworking and human factors. All of these contribute to adverse events.

As is evident from the literature above, the vast majority of adverse events and errors in surgery arise from failures in these human factors domains and can be traced back to poor work design rather than technical ineptitude on the part of the operating surgeon.

The Systems Engineering Initiative for Patient Safety (SEIPS) framework is one of the most widely used systems models for analysing healthcare and comprises three components: the work system, the care process and the patient outcome ( Fig. 9.2 ). The ‘work system’ encompasses five interconnected elements: the individual, tasks, tools and technologies, the physical environment and organisational conditions. These five interacting elements influence care and other connected processes, which in turn have an impact on outcomes. Although this chapter focuses largely on patient outcomes, the SEIPS model also addresses staff outcomes, including safety, health, satisfaction, stress and burnout, as well as organisational outcomes, including rates of turnover, injuries and illnesses, and organisational health.

  • (i) Individual: The individual at the centre of the work system may be any healthcare provider or team performing patient care-related tasks, or a patient receiving care together with their family and support system. System design must take into account personal characteristics (including age, competence, preferences, ability to manage health information and wellbeing) as well as collective characteristics such as team cohesiveness or consistency of knowledge.

  • (ii) Tools and technologies: These are the items the caregiver requires to do their work, ranging from paper and pencil to computers, and including information technology, medical devices and physical equipment. Relevant characteristics include usability, accessibility, familiarity and automation.

  • (iii) Tasks: These are the specific actions required of the caregiver in treating the patient, e.g. documenting results, talking with patients and interacting with the team. Task factors include attributes such as complexity, variety and ambiguity.

  • (iv) Organisational factors: Organisation refers to the structures external to a person that organise time, space, resources and activity. Within hospitals, organisational factors include characteristics of work schedules, management systems, organisational culture, training and resource availability.

  • (v) Environment: The internal environment is the physical environment in which staff work; relevant characteristics include lighting, physical layout and protection from hazards.

Figure 9.2, Human factors Systems Engineering Initiative for Patient Safety (SEIPS) model.

Many of these human factors exist in a latent form within our environment, but they affect those of us working at the ‘sharp end’ in the operating theatre, as surgical teams are the final common pathway through which harm reaches the patient ( Fig. 9.3 ). In other high-risk industries, human factors scientists have done considerable work to engineer out many of the latent threats that lurk in work systems; in healthcare we are still at a rudimentary stage of this journey. At present we must rely predominantly on humans as the final line of defence to prevent harm coming to our patients and to optimise surgical outcomes. While technical skills undoubtedly play a part in this, it is equally important that we are able to deploy good non-technical skills to minimise the risk of harm to our patients.

Figure 9.3, The Swiss cheese model applied to a case of wrong site surgery. The slices of cheese are defensive layers within the healthcare system, the holes are transient or permanent gaps in these defences. When the holes align, a significant adverse event occurs.

Non-technical skills in surgery

Although there is undoubtedly a link between technical surgical skill and patient outcomes, it is clear that the majority of error and unintended harm arises from poor non-technical skills on the part of the surgical team: loss of situation awareness (SA) and poor decision-making, compounded by poor leadership, poor teamwork and suboptimal communication, are all implicated. Despite this, until recently there was no way of describing, categorising or rating these non-technical skills, and as a result they were not formally acknowledged, taught or assessed in surgeons.

In the early 2000s, a collaboration between the University of Aberdeen Industrial Psychology Research Centre and the Royal College of Surgeons of Edinburgh produced the Non-Technical Skills for Surgeons (NOTSS) taxonomy to describe and assess these non-technical skills in the intraoperative environment. The taxonomy was designed specifically for use in the operating theatre and is currently the most widely validated tool for rating and classifying surgical non-technical skills.

The NOTSS classification ( Table 9.1 ) describes two cognitive non-technical skill categories (situation awareness and decision-making) and two social non-technical skill categories (communication and teamwork, and leadership). Each of the four categories is subdivided into three further elements that describe it in more detail. The taxonomy can be used to rate and improve the non-technical skills of surgeons in the intraoperative environment, and each category will now be considered in turn.

Table 9.1
Non-Technical Skills for Surgeons (NOTSS) skills taxonomy v1.2

Category                       Elements
Situation awareness            Gathering information
                               Understanding information
                               Projecting and anticipating future state
Decision-making                Considering options
                               Selecting and communicating option
                               Implementing and reviewing decisions
Communication and teamwork     Exchanging information
                               Establishing a shared understanding
                               Coordinating team activities
Leadership                     Setting and maintaining standards
                               Supporting others
                               Coping with pressure
