Safety and outcome in pediatric anesthesia


Introduction

Patients engage the healthcare system seeking help—help managing pain, help regaining function, and help restoring some level of homeostasis in their lives. Unfortunately, for many patients, the system designed to help instead introduces new risks, new ailments, and new disabilities. The ability of the healthcare system to both heal and harm is one of the central conflicts for providers, patients, and policy makers alike. Patients increasingly bring their own host of risks, comorbidities, and assumptions to the table. Healthcare—now more than ever—is a complex ballet of providers, incentives, and technologies. The intersection of complex patients and complex systems creates abundant opportunities for unintended outcomes. Patient safety is the study of how to design complex systems to ensure healthcare upholds the 2000-year-old promise to “first, do no harm.” Anesthesiology has long prided itself on being a leader in promoting patient safety. Because anesthesia is a means to an end rather than an end unto itself, the specialty has an undeniable moral imperative to continue developing strategies that improve patient safety and eliminate preventable harm.

History of safety

Within 2 years of the first public demonstration of general anesthesia at the Ether Dome in Massachusetts, the first documented anesthesia-related death occurred in 1848, when a 15-year-old received chloroform for a toenail procedure. The first 100 years of modern anesthesia focused on adding capabilities to support increasing surgical complexity, without any systematic effort to study the consequences. It was not until 1949 that Macintosh’s article in the British Journal of Anaesthesia, “Deaths Under Anaesthetics,” recited a litany of anesthesia horror stories and concluded that anesthetic deaths are caused by errors and are therefore preventable. The first large-scale study of anesthesia-related deaths, published by Beecher and Todd in the Annals of Surgery in 1954, revealed a mortality rate of 1 out of every 1560 anesthetics. Characterizing anesthesia-related deaths as a “public health problem,” the authors noted: “there were 2.4 times as many deaths each year attributable to anesthesia in the total population of the United States during the five years of this study as there were deaths attributed to poliomyelitis.” They also asserted that considerable research funding was devoted to treating polio while “next-to-nothing . . . to overcome the hazards of anesthesia.” These and subsequent studies of anesthesia-related mortality in the 1960s and 1970s remained focused on individual errors as the cause of all mortality.

In 1978 Cooper—a bioengineer by training—made the first attempt to study the causes of anesthetic mishaps, employing the aviation strategy of “critical incident analysis.” A series of interviews revealed a number of contributing factors that were classified and analyzed for patterns. The most common incident was breathing circuit disconnection, typically occurring in the middle of a case. Interestingly, he also concluded that rather than increasing errors, hand-offs between providers more often led to the discovery of undetected problems. Rising malpractice premiums at the self-insured Harvard Medical School in the 1980s led to the first anesthesia monitoring standards, which would later be adopted by the American Society of Anesthesiologists (ASA). In 1982 the 20/20 TV episode entitled “The Deep Sleep: 6,000 Will Die or Suffer Brain Damage” opened with the announcer stating: “If you are going to go into anesthesia, you are going on a long trip and you should not do it, if you can avoid it in any way.” Victims of anesthesia errors were interviewed, and anesthesiologists were described covering multiple operating rooms by “run[ning] quickly and pray[ing] a lot.” The subsequent fallout led to the creation of the ASA’s Closed Claims Project, which analyzed malpractice claims, and the founding of the Anesthesia Patient Safety Foundation in 1985.

In 1999 the Institute of Medicine’s landmark study “To Err Is Human” estimated that between 44,000 and 98,000 Americans die each year from medical mistakes, which again put the national spotlight on patient safety. Over the last 30 years new airway management devices, new monitoring modalities, and new medications have all helped to make anesthesia safer. More importantly, a cultural change within the specialty has enabled new standards, new reporting systems, and a focus on system and design solutions rather than individual blame. A meta-regression by Bainbridge et al. suggests that the risk of anesthesia-related mortality has decreased steadily over the past several decades (Fig. 60.1).

Fig. 60.1, Meta-regression for Risk of Mortality Attributable to Anesthesia by Year.

Today the ASA manages the Anesthesia Quality Institute (AQI), which oversees the National Anesthesia Clinical Outcomes Registry (NACOR) and the Anesthesia Incident Reporting System (AIRS), as well as the Pediatric Perioperative Cardiac Arrest (POCA) Registry under the Closed Claims Project. The focus of patient safety has evolved from individual providers to complex systems, from individual cases to big data, and from retrospective analyses to real-time data monitoring and prospective interventions. With the advent of increasingly integrated information systems, new opportunities are on the horizon to better understand the outcomes of patients undergoing anesthesia at a scale never before possible.

Theory and terminology

Making anesthesia safer requires an understanding of the many factors that lead to patient harm in complex systems. Comparisons to high-reliability industries, like aviation and nuclear power, are a popular way to try to better understand the challenges of anesthesia. Aviation in particular—with its stellar safety record and analogous pattern of take-offs and landings—frequently elicits comparison. The components of a safety management system from the aviation industry include:

1. Safety policy
2. Senior management accountable for safety
3. Hazard identification and risk management
4. Organizational manual
5. Trained and competent personnel
6. Reporting system
7. Compliance monitoring system

Aviation is not a perfect analogy: if airplanes were patients, they would be unimaginably complex and variable, they would routinely fly with malfunctioning parts, and “bad weather” would not cancel flights. However, anesthesia has borrowed many useful tools from the aviation industry, including safety checklists, simulators, and work hour rules. As airplanes became increasingly complex in the early 20th century, the emphasis shifted from finding pilots with the “right stuff” to understanding the complex interactions between humans and machines. Over the last 50 years, the focus of aviation accident investigation moved away from individual error to examine cockpit design (human factors engineering), system complexity (complex adaptive systems), and teamwork and communication errors (crew resource management). Most importantly, the emphasis on safety culture encouraged nonpunitive reporting systems that were able to capture “near-miss” events, providing much more reliable data for making systems safer.

Terminology

Although industry analogies have provided valuable tools, they have also created some confusion about overlapping terminology, like the differences between harm and injury, or between errors and mistakes. Patient safety operates at the intersection of errors and adverse events.

Errors are actions or processes that deviate from provider intentions or from accepted standards of care. Unintended actions are traditionally known as mistakes, and providers universally try to avoid them. There may be a variety of individual and systemic reasons for such an error, but the discrepancy between the intended action and the actual one is usually clear and can be treated like a stable engineering problem: ensuring the intended action happens more reliably. The second form of error—deviation from the standard of care—is more dynamic, because the correct action or process is less clearly defined and may change over time. Typically, the generally accepted performance standards (GAPS) serve as the metric. The reasons for deviating from these standards can be more complex: some stem from gaps in knowledge or resources, while those resulting from improper planning or consideration tread into the realm of negligence.

In addition to their origins, errors can be categorized by their consequences. Some errors reach the patient; others do not. Errors that never reach the patient are known as “near misses.” Although they do not directly relate to clinical outcomes, they are valuable indicators of malfunctioning systems and are sometimes referred to as “pre-errors” because capturing them creates the opportunity to mitigate future problems. Of the errors that do reach the patient, some prove inconsequential, whereas others result in a variety of adverse events (Fig. 60.2).

Fig. 60.2, Venn Diagram of Patient Safety Terminology.

Adverse events are any undesirable clinical outcomes. Some are preventable; some are not. Some are related to errors; some are not. All patients arrive in some form of disequilibrium, and some will emerge worse off than before. The adverse events of greatest interest to patient safety are those caused by healthcare-related errors, because if errors could be eliminated (i.e., if every healthcare process unfolded as intended and up to standard), these outcomes would presumably be preventable.

Adverse events can be further stratified by the degree of impact on the patient, typically known as harm. Harm can be categorized by severity and duration. Harm may be minimal, moderate, or severe: minimal harm may or may not require intervention; moderate harm may prolong a hospitalization and require additional healthcare interventions to correct; severe harm requires life-sustaining interventions, and the most severe results in permanent disability or death. Harm may be temporary or permanent. The type and degree of an error can certainly affect the patient’s outcome, but health systems are complex and nonlinear enough that there is not always a clear correlation between the severity of an error and the severity of a patient outcome. For example, if an anesthesiologist reaches for a vial of an opioid but accidentally grabs a muscle relaxant, the consequences vary with the context. If a bar-code scanning system catches the incorrect vial and it is never prepared, the event becomes a near-miss safety event. If the provider gives the medication but the patient is ventilated, the procedure lasts a few hours, and the patient suffers no ill consequences, it becomes a precursor safety event. If the patient is instead breathing spontaneously, becomes hypoxic, and suffers a cardiac arrest, the same original error becomes a serious safety event (Fig. 60.3).

Fig. 60.3, Safety Event Classification.
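The decision logic behind this classification is simple enough to express directly. Below is a minimal illustrative sketch in Python; the type names and severity tiers are assumptions drawn from the description above, not a published standard:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    NONE = 0      # no detectable effect on the patient
    MINIMAL = 1   # may or may not require intervention
    MODERATE = 2  # prolongs hospitalization, needs correction
    SEVERE = 3    # life-sustaining intervention, disability, or death

@dataclass
class ErrorReport:
    reached_patient: bool  # did the error reach the patient?
    severity: Severity     # degree of resulting harm

def classify(event: ErrorReport) -> str:
    """Map an error report onto the near-miss/precursor/serious scheme.
    Treating moderate harm as 'serious' is an assumption of this sketch."""
    if not event.reached_patient:
        return "near-miss safety event"   # e.g., barcode scanner catches the wrong vial
    if event.severity in (Severity.NONE, Severity.MINIMAL):
        return "precursor safety event"   # e.g., wrong drug given, no ill consequence
    return "serious safety event"         # e.g., hypoxia and cardiac arrest

# The same original error (grabbing the wrong vial) lands in different
# categories depending on context:
print(classify(ErrorReport(reached_patient=False, severity=Severity.NONE)))
print(classify(ErrorReport(reached_patient=True, severity=Severity.NONE)))
print(classify(ErrorReport(reached_patient=True, severity=Severity.SEVERE)))
```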

Safety, quality, and value

Quality is a more expansive goal than safety. In Crossing the Quality Chasm, the Institute of Medicine defined quality as “the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge.” Safety was the first of six components necessary to achieve quality.

Components of quality healthcare:

1. Safe
2. Effective
3. Patient-centered
4. Timely
5. Efficient
6. Equitable

Even if everything is done correctly (safety), it still may not be done well (quality). Safety is necessary, but not sufficient, to achieve quality.

In 1966 Donabedian laid the foundation for health quality research by outlining three primary components: (1) structure, how care is organized; (2) process, the components of care delivered; and (3) outcome, recovery, restoration of function, and survival (Fig. 60.4). Today, structure is assessed through provider board certification and hospital accreditation, and the Centers for Medicare & Medicaid Services (CMS) increasingly reports both process and outcome measures publicly. Early on, Donabedian identified the challenges of measuring quality from medical records or direct observation of clinical encounters. He pushed for clear measurement standards that could be linked with patient outcomes, expanded the focus of quality research beyond the technical management of illness, and inspired the current focus on value-based payments and patient-centered outcomes.

Fig. 60.4, Donabedian Model for Quality of Care.

Creating a safety culture

The foundation of all safety efforts is to create a culture of safety. Without an underlying culture of safety, siloed efforts from different areas can struggle to gain traction within an institution. An effective culture of safety consists of four parts, the 4 Ts:

1. Transparency: accountability, blameless reporting
2. Transition management: leadership, engagement, change management
3. Teamwork: training, communication, hierarchy
4. Technology: integration, decision support, human design

Transparency

Transparency is critical for gathering the reliable, accurate, and comprehensive data that drive all safety efforts. Documenting errors is always a fraught enterprise: fear of liability, embarrassment, and punitive action often discourages providers from reporting errors. Traditionally, the only errors reliably captured have been the ones with obvious, severe outcomes, yet capturing near-miss errors allows more proactive efforts to prevent future mishaps. It behooves leadership to set an expectation of “no-fault” reporting in order to gather the data necessary to assess opportunities for improvement.

Self-reporting is the most common way of gathering error data; it is the simplest but usually the least complete. Creating a sense of urgency and removing barriers to reporting can help, and self-reports may identify trends, but they will never be comprehensive. Forcing functions, like requiring a report (whether positive or negative) before closing an anesthesia record, can also help capture additional data. Third-party auditing generates a more accurate account of errors, but it tends to be resource-intensive and unsustainable. Most often, a combination of the two, ongoing facilitated self-reporting with periodic audits, gathers the most reliable data.
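As a minimal sketch of such a forcing function in Python (the record fields and function names are hypothetical, not any specific EHR’s API), a system might refuse to close an anesthesia record until a report, positive or negative, has been filed:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnesthesiaRecord:
    case_id: str
    # None until the provider files a report; True/False indicates
    # whether any safety event occurred during the case.
    safety_event_reported: Optional[bool] = None
    safety_event_details: str = ""

def close_record(record: AnesthesiaRecord) -> None:
    """Forcing function: the record cannot be closed without a report."""
    if record.safety_event_reported is None:
        raise ValueError(
            f"Case {record.case_id}: file a safety report (even 'no event') "
            "before closing the record."
        )
    if record.safety_event_reported and not record.safety_event_details:
        raise ValueError(f"Case {record.case_id}: describe the reported event.")
    print(f"Case {record.case_id} closed.")

record = AnesthesiaRecord(case_id="OR-1234")
record.safety_event_reported = False  # an explicit 'no event' still counts as a report
close_record(record)
```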

Transition management

Change is hard. Change is even harder in healthcare, where the stakes are high and most training happens via apprenticeship. Transition management begins with leadership explaining and justifying the change to the organization. Champions are identified, outliers are supported, and progress is measured. There are a variety of models for change management, from John Kotter’s 8-Step Process for Leading Change, popularized at Harvard Business School, to W. Edwards Deming’s Plan-Do-Check-Act cycle, which was instrumental in the industrial transformation of postwar Japan.

Kotter’s 8-Step Process for Leading Change:

1. Create a sense of urgency
2. Build a guiding coalition
3. Form a strategic vision and initiatives
4. Enlist a volunteer army
5. Enable action by removing barriers
6. Generate short-term wins
7. Sustain acceleration
8. Institute change

What these models share is an acknowledgment that change does not happen simply because it is the rational choice for improving efficiency or safety. For an institution to change, the people who are the institution have to change themselves.

Teamwork and communication

Teamwork and communication are essential given the large number of people required to care for each patient, and failures of teamwork or communication are among the most common causes of preventable harm. The importance of flattening traditional hierarchies was a lesson learned in 1997, when the pilot of Korean Air Flight 801 ignored the protests of the copilot and crashed into the side of a mountain in Guam. The operating room has traditionally been dominated by the surgeon, which is why the first item on the World Health Organization Surgical Safety Checklist is to introduce all team members. This element is not a mere courtesy but an attempt to “activate” staff so they are more likely to speak up later in the procedure if they notice any anomalies (Fig. 60.5).

Fig. 60.5, World Health Organization Surgical Safety Checklist, Revised 2009.

Face-to-face communication is important, but critical instances benefit from a formal structure. Hand-offs between providers—whether from one shift to the next or one unit to the next—are some of the highest-risk times for patients, and they are more consistent and reliable when given a defined structure. Information degrades with each hand-off, and the consequences can be so profound that The Joint Commission established a National Patient Safety Goal in 2006 mandating the use of structured, standardized hand-offs. One 2014 study of hand-offs from the night shift to the morning team—increasingly a concern with resident duty-hour restrictions—found that on-call trainees omitted 40% of clinically important issues during the morning hand-off and failed to document 86% of those issues in the medical record. See Box 60.1 for an anesthesia hand-off checklist.

BOX 60.1
Anesthesia Hand-Off Checklist
From Boat, A. C., & Spaeth, J. P. (2013). Handoff checklists improve the reliability of patient handoffs in the operating room and postanesthesia care unit. Paediatric Anaesthesia, 23, 647–654.

General demographics:

  • Age, weight

  • Allergies

  • Procedure

Medical history, family issues

IV access/invasive catheters

Anesthetic:

  • Type (gas, TIVA, sedation)

  • Airway

  • Medications (antibiotics, narcotics, acetaminophen)

  • Positioning

Input/output:

  • Crystalloid/colloid

  • EBL/UOP

Laboratory tests:

  • Type and screen/blood consent

Disposition:

  • ICU vs. PACU

  • Postoperative orders

  • Pain plan: regional, PCA

EBL/UOP, Estimated blood loss/urine output; ICU, intensive care unit; IV, intravenous; PACU, postanesthesia care unit; PCA, patient-controlled analgesia; TIVA, total intravenous anesthesia.
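To illustrate how such a checklist can be enforced as a structured hand-off rather than an ad hoc conversation, here is a minimal Python sketch; the type and field names are hypothetical, modeled loosely on Box 60.1:

```python
from dataclasses import dataclass, fields

@dataclass
class AnesthesiaHandoff:
    # General demographics
    age: str
    weight: str
    allergies: str
    procedure: str
    # Anesthetic
    anesthetic_type: str   # gas, TIVA, sedation
    airway: str
    medications: str       # antibiotics, narcotics, acetaminophen
    # Input/output
    fluids: str            # crystalloid/colloid
    ebl_uop: str           # estimated blood loss/urine output
    # Disposition
    disposition: str       # ICU vs. PACU
    pain_plan: str         # regional, PCA

def missing_items(handoff: AnesthesiaHandoff) -> list[str]:
    """Return the checklist items left blank, so the hand-off cannot be
    marked complete until every field has been addressed."""
    return [f.name for f in fields(handoff) if not getattr(handoff, f.name)]
```

A receiving unit could refuse to accept the hand-off until missing_items returns an empty list, the same forcing-function idea discussed under Transparency.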

Electronic communication is increasingly central to contemporary healthcare practice. Concerns about security and privacy have prevented many healthcare organizations from capitalizing on technologic leaps in the consumer arena like smartphones and multimedia messaging platforms. Pagers were invented around the time penicillin entered widespread use, yet despite their numerous limitations, they remain ubiquitous in the healthcare industry. Paging a provider without knowing whether the message was received, then waiting by a phone for a response, is no longer an acceptable form of patient management. Seamless, reliable, multimedia tools that integrate with electronic health records (EHRs) and call schedules are critical for managing the complex healthcare environment of today.

Technology

Technology is inescapable in 21st-century medicine. EHRs have provided unprecedented access to information, solved legibility problems, and aided quality improvement by providing clinicians with real-time feedback. Unfortunately, the integration of information technology into healthcare has been a fitful, fragmented, and frustrating enterprise. Lack of competition, regulatory bloat, and a focus on billing and compliance rather than patient care have led to expensive and poorly designed EHRs despite billions of dollars of investment. New terms like “note bloat” have been coined to describe the clinical effects of electronic notes, and new roles like “medical scribe” have been created to shield clinicians from the increased documentation burden. Patients complain about lack of eye contact at clinic visits, and the RAND Corporation cites EHRs as a significant contributor to physician burnout.

To improve existing EHRs, the information needs to be much more accessible. Data need to be searchable in a variety of ways. Institutions need to be able to share patient records directly, and patients need better access to their own records. Data need to be much easier to mine in order to facilitate ongoing quality improvement efforts. Clinicians and administrators ought to be able to make simple, real-time queries of clinical outcomes across a variety of dimensions with analytical tools like AdaptX (formerly MDmetrix) OR Advisor (AdaptX, Inc., Seattle, WA). Anesthesia records in particular should be sources of real-time feedback rather than just data repositories. Tools like the Smart Anesthesia Manager at the University of Washington can remind providers to start the blood pressure cuff, give antibiotics on time, and adhere to clinical protocols for beta blockers and glucose control. EHRs have been very successful at gathering information into a single place, but they still struggle to present that information in a timely, customizable, and user-friendly manner, let alone add a layer of intelligence on top of it.
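As a sketch of what such real-time decision support can look like, consider an antibiotic-timing reminder in Python; the rule, thresholds, and function name here are illustrative assumptions, not the actual logic of the Smart Anesthesia Manager:

```python
from datetime import datetime, timedelta
from typing import Optional

def antibiotic_reminder(incision_time: datetime,
                        antibiotic_given: Optional[datetime],
                        now: datetime) -> Optional[str]:
    """Fire a reminder if prophylactic antibiotics have not been documented
    as the incision approaches (the 15-minute lead time is an assumption)."""
    if antibiotic_given is not None:
        return None  # already documented; stay silent
    if now >= incision_time - timedelta(minutes=15):
        return "Reminder: prophylactic antibiotics not yet documented."
    return None

# Example: incision at 08:00, nothing documented by 07:50 -> reminder fires.
print(antibiotic_reminder(datetime(2024, 5, 1, 8, 0), None,
                          datetime(2024, 5, 1, 7, 50)))
```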

Human interface design

The failures of human interface design in healthcare have had devastating consequences. The infamous Therac-25 radiation therapy machine, a failure of both software and human interface design, delivered massive radiation overdoses to six patients in the 1980s, several of them fatal. In contrast, after machine and breathing circuit problems were identified as a primary source of anesthesia-related mortality in the 1980s, the anesthesia machine was redesigned into an example of excellent safety engineering. The modern machine features multiple fail-safes that make it very difficult to (1) deliver a hypoxic mixture of gas, (2) connect gas hoses incorrectly, (3) fill vaporizers with the wrong volatile agent, or (4) overpressurize the breathing circuit. As a result, the gases—whether from an oxygen hose, a nitrous tank, or an anesthetic vaporizer—are very difficult to administer incorrectly.

Unfortunately, intravenous medications do not enjoy similar safeguards. Medication vials appear in a variety of look-alike and sound-alike preparations and have few industry standards for their appearance. Syringes are easy to swap in the fast-paced operating room environment, nothing prevents intravenous medications from being given via an alternative route, and most dose calculations are done in providers’ heads.

ASTM International endorses a medication class–specific color-coding system designed to make higher-risk interclass swaps less likely than the presumed lower-risk intraclass swaps. However, even foundational principles, like whether to use color-coded syringe labels at all, are still being debated. Vials, on the other hand, are all color-coded, but the colors are driven purely by marketing rather than by any rational, safety-related system.
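To make the interclass-versus-intraclass distinction concrete, here is a minimal Python sketch. The drug-to-class assignments are standard pharmacology, but the specific colors are included only as commonly cited examples of class-specific labeling, not as an authoritative reproduction of the ASTM scheme:

```python
# Hypothetical, partial mapping of medications to classes and of classes to
# label colors, in the spirit of a class-specific color-coding scheme.
SYRINGE_CLASS = {
    "fentanyl": "opioid",
    "morphine": "opioid",
    "rocuronium": "neuromuscular blocker",
    "succinylcholine": "neuromuscular blocker",
    "propofol": "induction agent",
}

CLASS_COLOR = {
    "opioid": "blue",
    "neuromuscular blocker": "fluorescent red",
    "induction agent": "yellow",
}

def swap_risk(intended: str, grabbed: str) -> str:
    """Classify a syringe swap as intraclass (presumed lower risk) or
    interclass (higher risk, the case the color scheme targets)."""
    a, b = SYRINGE_CLASS[intended], SYRINGE_CLASS[grabbed]
    if a == b:
        return f"intraclass swap within '{a}' (same color: {CLASS_COLOR[a]})"
    return (f"interclass swap: '{a}' ({CLASS_COLOR[a]}) vs. "
            f"'{b}' ({CLASS_COLOR[b]}); colors differ to flag the error")

print(swap_risk("fentanyl", "rocuronium"))  # the opioid/relaxant example above
```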

Labels in general are inherently treacherous because they (1) physically prevent nothing; (2) lack standardization; and (3) require intense focus and attention, which are in short supply in the operating room. Many a root cause analysis (RCA) implores providers to “read the label more carefully,” but that is like asking providers to look at the gas hoses more closely instead of implementing the pin-index safety system.

The human mind is designed to make assumptions, use shortcuts, and fill in blanks. For example, with the “filling in” optical illusion, the human visual system creates a square where none exists (Fig. 60.6).

Fig. 60.6, “Filling-In” Optical Illusion Creates a Square Where There is None.

Rather than removing visual cues, it is better to provide multiple, even multisensory, cues. The oxygen knob on many anesthesia machines is not only a different color from the air and nitrous oxide knobs but often has a different texture as well (Fig. 60.7).

Fig. 60.7, Flowmeter Control Knobs on Draeger Fabius Tiro Anesthesia Machine.

The Anesthesia Medication Template—designed at the University of Washington—uses size, shape, location, elevation, and color to help discriminate syringes. Bar-coding systems like the one developed at the University of Auckland use both visual and auditory feedback to identify medication vials (Fig. 60.8).

Fig. 60.8, Anesthesia Medication Template.

Physical countermeasures are much more robust than any form of provider feedback, which is why the Global Enteral Device Association is working to establish a new standard for the ubiquitous Luer interconnections so that enteral medications cannot inadvertently be given intravenously. Similar efforts aim to make neuraxial medication, respiratory connections, and blood pressure cuffs similarly incompatible with intravenous lines (Fig. 60.9).

Fig. 60.9, ENFit Patient Access Enteral Connector.
