The evolving epidemiology of multiple organ failure


Multiple organ failure (MOF) has plagued surgical intensive care units (ICUs) for nearly 5 decades. MOF was first described as a syndrome of progressive organ failure leading to early death that most often occurred after sepsis (principally from intra-abdominal infections [IAIs]). With ongoing research, it was recognized that MOF could also occur soon after noninfectious insults (principally blunt trauma). Over the ensuing decades, with fundamental advances in sepsis, trauma, and ICU care, the epidemiology of MOF and its perceived pathobiology have evolved through a series of predominant phenotypes into what is now a lingering chronic critical illness (CCI) leading to frailty, long-term disabilities, and indolent death. In 2012, the University of Florida (UF) Sepsis Critical Illness Research Center (SCIRC) coined the term “persistent inflammation, immunosuppression, and catabolism syndrome” (PICS) to describe the underlying pathobiology of this new CCI MOF phenotype that is now commonly seen in surgical ICU survivors. This paradigm was proposed to provide a mechanistic framework in which to study CCI in surgical ICU patients who are now surviving previously lethal inflammatory insults (including trauma, sepsis, burns, pancreatitis, and complicated surgery). The purpose of this chapter is to describe (1) the epidemiology and proposed pathobiology of the evolving phenotypes of MOF, (2) the most recent PICS-CCI paradigm, and (3) the UF SCIRC's ongoing efforts to validate the PICS-CCI paradigm.

Evolving epidemiology and proposed pathobiology of MOF

MOF emerged in the early 1970s as a result of newly organized ICUs with technology that allowed patients to survive single-organ failure. Over the ensuing decades, tremendous advances in trauma, sepsis, and perioperative critical care have progressively improved ICU mortality, and as a result MOF has evolved through a series of predominant clinical phenotypes. Recognition of these phenotypes spurred focused research efforts to understand the underlying pathobiology and advance care. This interesting history is summarized in Figure 1 and will serve as an outline for the subsequent discussion.

FIGURE 1, Timeline of evolving epidemiology of multiple organ failure (MOF). ACS, Abdominal compartment syndrome; ATLS, advanced trauma life support; CARS, compensatory anti-inflammatory response syndrome; CT, computed tomography scan; FAST, focused assessment with sonography for trauma; ICU, intensive care unit; IAI, intra-abdominal infection; PACs, pulmonary artery catheters; TPN, total parenteral nutrition; SIRS, systemic inflammatory response syndrome; SOPs, standard operating procedures; PICS-CCI, persistent inflammation, immunosuppression, and catabolism syndrome–induced chronic critical illness.

Septic auto-cannibalism

Seminal reports in the mid to late 1970s described refractory MOF as a “fatal expression of uncontrolled infection” with an ICU mortality exceeding 80%. MOF was frequently associated with IAI after penetrating trauma and emergency abdominal surgery. As a result, research attention focused on (1) earlier diagnosis of IAI with newly available computed tomography scans, (2) appropriately dosing new generations of antibiotics, (3) doing better operations, and (4) developing effective percutaneous interventional radiology drainage techniques. By the early 1980s, with better understanding of the injury stress response, persistent hypermetabolism inducing acute protein catabolism in high-risk patients was recognized to be a major issue. This was linked to progressive immunosuppression, nosocomial infections, and worsening MOF in early survivors. Total parenteral nutrition (TPN) had become widely available in clinical care and became a focus of surgical research. The term “septic auto-cannibalism” was coined to explain the resulting tremendous losses of lean body mass seen in MOF patients. This occurred despite early use of standard TPN and provided the rationale for formulating new stress formula TPNs (enriched with arginine, glutamine, and branched-chain amino acids). Unfortunately, the widespread use of these stress formula TPNs with the goal of placing critically ill surgical patients into early positive caloric and nitrogen balance failed to improve outcomes. A series of clinical trials in the 1980s convincingly showed that early enteral nutrition (EEN) reduced nosocomial infections compared with early TPN. This sparked interest in the role of the gut as the “motor” of MOF, and at the time bacterial translocation (BT) was promulgated as a unifying explanation. While the rodent studies of BT were convincing, translational human studies seriously questioned the clinical relevance of BT as an early pathologic event in MOF.
Subsequent studies indicated that EEN had beneficial effects on maintaining gut-associated mucosal immunity, which played an important role in enhancing systemic immunity and thereby reducing late nosocomial infections. It was concluded that BT is most likely a later pathologic event in MOF patients with ongoing gut dysfunction and disuse.

Sepsis syndrome

By the mid-1980s, reports from Europe showed that MOF frequently occurred after blunt trauma with no identifiable site of infection. It was recognized that both infectious and noninfectious insults (e.g., pancreatitis, ruptured abdominal aortic aneurysm) caused a similar “sepsis syndrome.” Research refocused on determining the driving mechanism(s) of noninfectious sepsis syndrome. At the time, new inflammatory mediators (known as “cytokines”) were recognized to be produced after a variety of insults, and the term “cytokine storm” was popularized. Additionally, given that shock was a consistent inciting event for MOF, whole-body ischemia-reperfusion causing systemic polymorphonuclear neutrophil (PMN) activation that initiated a diffuse endothelial cell injury was another attractive explanation. Clinical studies showed that early MOF after major trauma could be precipitated by a massive insult or two appropriately timed lesser insults. Laboratory in vitro studies characterized PMN “priming and activation,” which led to the proposed two-hit model of MOF. This was validated in in vivo rodent models of MOF and was subsequently shown to be clinically relevant in human trauma studies. Finally, in the mid-1990s, the danger hypothesis was popularized based on the recognition that dying, necrotic, or pyroptotic cells release endogenous compounds called “damage-associated molecular patterns” (DAMPs) that act through the same pattern recognition receptor (PRR) pathways (e.g., toll-like receptors [TLRs]) that recognize microbial products called pathogen-associated molecular patterns (PAMPs) to stimulate innate immune responses. This provided the mechanistic underpinnings for the role of DAMPs (including mitochondrial DNA, HMGB1, S100A, and heat shock proteins) in eliciting inflammatory responses comparable to those elicited by microbial DNA, endotoxin (LPS), and proteoglycans.
Thus, it was widely recognized that infectious and noninfectious insults can elicit a similar sepsis syndrome, which by this time was referred to as the systemic inflammatory response syndrome (SIRS).

Unrecognized shock

By the mid-1980s, pulmonary artery catheters (PACs) were being increasingly used for perioperative monitoring of high-risk surgical patients. In prospective studies, Dr. William Shoemaker recognized that nonsurvivors failed to develop a hyperdynamic response (characterized by high cardiac output and low systemic vascular resistance) during postoperative shock resuscitation and had persistently low systemic oxygen consumption (VO2). He confirmed this in a prospective study of traumatic shock patients. Based on these experiences, he proposed the enticing hypothesis that persistent unrecognized shock as a result of impaired flow-dependent VO2 was an important cause of noninfectious MOF. He championed the concept of supranormal oxygen delivery (DO2) resuscitation to eliminate unrecognized oxygen debt as a strategy to prevent MOF. In the late 1980s, with advances in PAC technology (including continuous cardiac output and mixed venous oximetry), it became standard of care in many U.S. trauma centers to presumptively place PACs in high-risk patients to guide early ICU resuscitation with the goal of maximizing DO2. Although this strategy was ultimately disproven, the prospective data collected from established protocols provided convincing evidence that severity of shock (reflected by initial high base deficits and persistently elevated lactate levels) and impaired myocardial function played a predominant role in the later development of MOF. In these studies, transfusion of greater than 6 units of packed red blood cells (PRBCs) within the first 12 hours post injury was shown to be a consistent, independent, and strong predictor of early MOF. This finding, plus other compelling evidence, indicated that PRBCs played a major role in the priming/activation of PMNs. It was found that during PRBC storage, cell wall degradation produced proinflammatory lipids (e.g., platelet-activating factor) that were robust priming agents for PMN superoxide and elastase production.
Although washing PRBCs was reasonably effective in attenuating their priming capacity, this was logistically problematic for trauma patients requiring a massive transfusion. Recognizing that patients in traumatic shock require an oxygen carrier led to the investigation of the potential role of a hemoglobin-based oxygen carrier (HBOC) for early resuscitation. It was documented in vivo that PolyHeme (a human polymerized hemoglobin solution) avoided transfusion-induced PMN priming. Ultimately, this work led to a large multi-institutional, FDA-approved phase III clinical trial of PolyHeme for the early resuscitation of critically injured patients. Patients resuscitated with PolyHeme, without stored blood for up to 6 units in the 12 hours post injury, had outcomes comparable with those of the standard of care. It was concluded that the benefit-to-risk ratio of PolyHeme was favorable when blood is needed but not available. Unfortunately, PolyHeme did not receive final FDA approval because of concerns over higher-than-expected serious adverse events related to myocardial infarction. The quest for a suitable HBOC is ongoing.

Epidemic of the abdominal compartment syndrome

By the late 1980s, surgical critical care was established as an essential component of trauma care throughout the United States. Simultaneously, trauma systems, Advanced Trauma Life Support (ATLS), and damage control surgery were universally adopted. As a result of these fundamental changes in trauma care, early mortality after severe trauma in the United States decreased substantially in the early 1990s. However, an epidemic of abdominal compartment syndrome (ACS) emerged as more severely injured patients were surviving long enough to be admitted to the ICU. Into the early 2000s, ongoing prospective studies of high-risk patients treated by standardized protocols revealed that ACS was largely an iatrogenic complication resulting from delayed hemorrhage control, initial overzealous ATLS crystalloid resuscitation, and futile supranormal DO2 resuscitation in the ICU. With the widespread adoption of massive transfusion protocols, limiting of early crystalloids, and the FAST exam and whole-body CT scans, coupled with emphasis on early hemorrhage control and the abandonment of PAC-directed ICU resuscitation, early death from exsanguination decreased and ACS became a notably less common event. Out of this experience, the concept of hemostatic resuscitation was developed and refined. As a result of ongoing civilian and military research efforts, it has evolved into damage control resuscitation, which is now the standard of care worldwide, and ACS has become a relatively rare event.
