Fundamentals of Dynamic Decision Making in Anesthesia


This book is about decision making and crisis management in anesthesia. What is a crisis? It is “a time of great danger or trouble whose outcome decides whether possible bad consequences will follow.” For our purposes, the time of great danger is typically a brief, intense event or sequence of events that offers a clear and present danger to the patient. Almost by definition, a crisis requires an active response to prevent injury to the patient; it is unlikely to resolve on its own. Of course, the best way to deal with a crisis is to prevent it from occurring in the first place. An old saying is that “it is easier to stay out of trouble than to get out of trouble.”

Skilled crisis management in anesthesia is no mystery. It demands that the anesthesia professional, while under stress and time pressure, optimally implement standard techniques of diagnosis and treatment for the patient. Medical knowledge and skills are essential components of the decisions and actions performed during crises, but they are not enough. To make things happen quickly and safely in patient management, the anesthesia professional must manage the entire situation, including the environment, the equipment, and the patient care team. These management skills draw on cognitive and social psychology and even sociology and anthropology. In this chapter we delineate the underlying conceptual foundations of patient safety, and in the next chapter we provide specific practical principles of crisis management. Chapter 3 reviews how to train clinicians to enact these principles, and Chapter 4 covers the art and science of debriefing about crisis management after real patient care events or after simulation scenarios. The remainder of this book (Catalog of Critical Events in Anesthesiology) offers specific recommendations for the recognition and management of a large variety of crisis situations.

Anesthesiology, by Its Nature, Involves Crises

Why is a book on medical crisis management addressed to anesthesia professionals (a term that encompasses anesthesiologists, nurse anesthetists, and anesthesia assistants)? What makes anesthesiology and a few other medical domains (such as intensive care medicine, emergency medicine, obstetrics, neonatology, and surgery) different from most other medical fields? The answer, to a large extent, is that the clinical environment of anesthesiology is dynamic, and this dynamism interacts very strongly with the complexity of the environment. The combination of complexity and dynamism makes crises much more likely to occur and more difficult to deal with. Thus the expert anesthesia professional must be skilled, and therefore trained, in crisis management. Following the work of Woods, and of Orasanu and Connolly, we address some of the aspects of anesthesia that make it a “complex, dynamic world,” namely, that it is event-driven and dynamic, complex and tightly coupled, uncertain, and risky (for the patient).

Event-Driven and Dynamic

The anesthetized patient’s state changes continuously. Unpredictable and dynamic events are frequent. The initiation of many events is beyond the anesthesia professional’s control, such as when the surgeon inadvertently transects a major vessel or when a patient with a previously unknown allergy suffers anaphylaxis.

Complex and Tightly Coupled

In technologic systems, complexity stems from a large number of interconnected components. The patient is the main “system” of interest to the anesthesia professional. Patients are intrinsically complex, containing many components whose underlying functions are imperfectly understood. Unlike industrial or aviation systems, patients are not designed, built, or tested by humans, nor do they come with an operator’s manual.

Some physiologic systems are buffered from changes in others, whereas certain core components, such as oxygen (O2) delivery and blood flow, are tightly coupled and interact strongly. Anesthesia ablates some protective and compensatory physiologic mechanisms and forces the patient’s systems to become more tightly coupled. The patient’s physiology may also become tightly coupled to external systems such as ventilators or infusions of hemodynamically active drugs.

Although the medical equipment connected to the patient is not as complex as that found in aircraft or spacecraft, it often consists of a proliferation of independent devices with multiple, nonstandardized interconnections. Devices are typically designed in isolation, so interactions between devices, or among the equipment, the patient, and the human operator, may not be adequately addressed in the design phase. These factors increase the complexity of the domain.

Uncertain

The patient as a system contains inherent uncertainties. Medicine knows very little about the underlying causes of specific physiologic events, even though the general physiologic principles involved can be described. The true state of the patient usually cannot be measured directly but must be inferred from ambiguous patterns of clinical observations and data from electronic monitors. These data are imperfect because, unlike industrial systems, which are designed and built with sensors in key areas to measure the most important variables, patients are monitored with separate, predominantly noninvasive methods that measure the variables easiest to observe. Most physiologic functions are observed indirectly through weak signals available at the body surface and thus are prone to various sorts of electrical and mechanical interference. Even invasive measurements are vulnerable to artifacts and uncertainties of interpretation.

Even if the anesthesia professional could know the exact patient state, the response of the patient to interventions is extremely variable. Even in “normal” patients, genetic or acquired differences in reflex sensitivity, pharmacokinetics, or pharmacodynamics can yield a wide range of responses to a given dose of a drug or to a routine action (e.g., laryngoscopy). In diseased or traumatized patients, or in the presence of acute abnormalities, these responses may be markedly abnormal, and patients may “overreact” or “underreact” to otherwise appropriate actions.

Risky

The decisions and actions taken by anesthesia professionals can determine the outcome for the patient. Even for elective surgery involving healthy patients, the risk of catastrophe is ever-present. Death, brain damage, or other permanent injury may be the end result of many pathways that can begin with fairly innocuous triggering events. Each intervention, even if appropriate, is associated with side effects, some of which are themselves catastrophic. Furthermore, many risks cannot be anticipated or avoided. Unlike a commercial flight, which can be delayed or aborted if a problem occurs, immediate surgery may be necessary to treat a medical problem that is itself life-threatening. Balancing the risks of anesthesia and surgery against the risks of the patient’s underlying diseases is often extremely difficult.

How Do Crises Arise?

A crisis is often perceived as sudden in onset and rapid in development, but, at least in retrospect, one can usually identify an evolution of the crisis from underlying triggering events. Figure 1-1 illustrates this process. In this model, underlying factors lead to specific triggering events, which initiate a problem. A problem is defined as an abnormal situation that requires the attention of the anesthesia professional but is unlikely, by itself, to harm the patient. Problems can then evolve and, if not detected and corrected by the anesthesia professional, they may lead to an adverse outcome for the patient. We consider this process in detail.

Figure 1-1, The process by which problems are triggered and then evolve during anesthesia. Interrupting this process can be accomplished by preventive measures or by dynamic detection and correction of the evolving event.

Problems Often Result from Latent Underlying Conditions

The events that trigger problems do not occur at random. They emerge from three sets of underlying conditions: (1) latent errors, (2) predisposing factors, and (3) psychological precursors.

Latent Errors

Latent errors, as described by Reason, are “…errors whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system’s defenses. [They are] most likely to be spawned by those whose activities are removed in both time and space from the direct control interface: designers, high-level decision makers, construction workers, managers, and maintenance personnel.”

Such latent errors exist in all complex systems. Reason describes them as “resident pathogens,” which, like microorganisms in the body, remain under control until sets of local circumstances “combine with these resident pathogens in subtle and often unlikely ways to thwart the system’s defenses and bring about its catastrophic breakdown” ( Fig. 1-2 ).

Figure 1-2, Reason’s model of accident causation. Accidents (adverse outcomes) require a combination of latent failures, psychological precursors, event triggers, and failures in several layers of the system’s “defense-in-depth.” This model is functionally equivalent to that shown in Figure 1-1 .

In anesthesia, latent errors can result from administrative decisions regarding the scheduling of cases, the assignment of personnel to staff them, and the priority given to such things as rapid turnover between cases. They can also result from the design of anesthesia equipment and its user interfaces, or from how drug vials and ampules are designed, labeled, and supplied to the anesthesiologist. Manufacturing defects and failures of routine maintenance are also latent errors.

Organizational Culture Factors

Safety in other industries of high intrinsic hazard is known to be primarily a property of systems rather than of individuals. Organizations that perform successfully under very challenging conditions, with very low levels of failure, are termed “high reliability organizations” (HROs). Among the first HROs to be studied were the flight decks of aircraft carriers. Others include certain military organizations, commercial aviation, electric power grids, and firms handling large-scale electronic financial transactions. Based on direct observation of HROs, investigators have determined that a key element of high reliability is a “culture of safety” or “safety climate” permeating the organization. Several features of a safety culture or climate are as follows:

  • A commitment to safety is articulated at the highest levels of the organization and translated into shared values, beliefs, and behavioral norms throughout all organizational levels.

  • The organization provides the necessary resources, incentives, and rewards to allow this to occur.

  • Following standard operating procedures and safety rules is a part of the behavioral norms.

  • Safety is valued as the primary priority, even at the expense of “production” or “efficiency.” Personnel are rewarded for erring on the side of safety, even if they turn out to be wrong.

  • The organization proactively manages safety and carefully monitors ongoing safety processes and operating procedures.

  • Communication among workers and across organizational levels is frequent and candid.

  • Unsafe acts are rare, despite high levels of production.

  • There is openness about errors and problems; they are reported when they occur.

  • Organizational learning is valued; the response to a problem is focused on improving system performance.

Think for a moment about how your organization compares to these safety ideals and where it could improve its performance. To the extent that a health care organization or work unit maintains a culture of safety, it can reduce the occurrence of latent errors and bolster flexible defenses against the accident sequences that do get started. However, there are many challenges to a culture of safety, particularly the erosion of safety in the pursuit of throughput and revenue. Such forces can lead to “production pressure”: internal or external pressure on the anesthesia professional to keep the OR schedule moving speedily, with few cancellations and minimal time between cases. When anesthesia professionals succumb to these pressures, they may fail to perform adequate preoperative evaluation and planning or neglect to conduct pre-use checks of equipment. Even when preoperative evaluation does take place, overt or covert pressure from surgeons (or others) to proceed with elective cases despite serious or uncontrolled medical problems can push anesthesia professionals to do things that are unsafe.

In 1994 we conducted a survey of California anesthesiologists concerning their experience with production pressures. We found that 49% of respondents had witnessed an event in which patient safety was compromised owing to pressure on the anesthesiologist. Moreover, 32% reported strong to intense pressure from surgeons to proceed with a case they wished to cancel; 36% reported strong to intense internal pressure to “get along with surgeons”; and 45% reported strong pressures to avoid delaying cases. Significantly, 20% agreed with the statement, “If I cancel a case, I might jeopardize working with that surgeon at a later date.” The economic pressures are obvious.

Production pressure also leads to haste by the anesthesia professional, which is another psychological precursor to the commission of unsafe acts. In the survey, 20% of respondents answered “sometimes” and 4% answered “often” to the statement, “I have altered my normal practices in order to speed the start of surgery,” and 20% of respondents rated pressure by surgeons to hasten anesthetic preparation or induction as strong or intense.

Comparable results were found in a survey of residents in anesthesiology. A similar survey conducted by Johnson in 2001 again found such pressures and experiences among anesthesiologists. Although the study has not been repeated in nearly 20 years, we believe that production pressures have only increased in the interim.

We also have conducted surveys across all hospital employees in multiple institutions, in studies involving tens of thousands of personnel, and have documented that production pressures exist throughout the hospital and are not unique to anesthesiology. Moreover, we found a threefold greater rate of responses indicative of a lack of safety culture for health care personnel (18%) than for naval aviators (6%) given matched questions concerning safety culture. Thus health care institutions do not yet have as strong a culture of safety as they should.

Local Predisposing Factors and Psychological Precursors

The final set of underlying conditions consists of latent psychological precursors, which predispose the anesthesia professional or surgeon to commit an unsafe act that triggers a problem. The primary psychological precursors are traditionally referred to as performance-shaping factors and include such elements as fatigue, boredom, illness, drugs (both prescription and recreational), and environmental factors such as noise and illumination. Factors of work culture in general, and of safety culture in particular, are also important to consider. Different combinations of psychological factors are discussed in detail in a number of review articles; general strategies to deal with performance-shaping factors and safety culture are discussed in Chapter 2.
