Shock, Electrolytes, and Fluid


Surgeons are the masters of fluids because they need to be. They care for patients who cannot eat or drink for various reasons: they have hemorrhaged, undergone surgery, or lost fluids from tubes, drains, or wounds. Surgeons are obligated to know how to care for these patients, who put their lives in their hands. This topic might appear simple only to those who do not understand the complexities of the human body and its ability to regulate and compensate for fluid shifts. In reality, managing patients’ blood volume is one of the most challenging burdens surgeons face, often requiring complete control of the intake and output of fluids and electrolytes, frequently in the presence of blood loss. Surgeons do not yet completely understand the physiology of shock and resuscitation, and their knowledge remains superficial. Given the nature of the profession, they have studied these topics and dealt with patients who bleed and exsanguinate. Historically, wartime experience has always advanced their knowledge of fluid management and resuscitation. The recent wars in Iraq and Afghanistan are no exception; much has been learned from them.

Constant attention to and titration of fluid therapy are required because the human body is dynamic. The key to treatment is to recognize an individual patient’s initial condition and to understand that fluid status is constantly changing. Bleeding, sepsis, neuroendocrine disturbances, and dysfunctional regulatory systems can all affect patients who are undergoing the dynamic changes of illness and healing. The correct management of blood volume is highly time-dependent. If it is managed well, surgeons are afforded the chance to manage other aspects of surgery, such as nutrition, administration of antibiotics, drainage of abscesses, relief of obstruction and of incarceration, treatment of ischemia, and resection of tumors. Knowing the difference between dehydration, anemia, hemorrhage, and overresuscitation is vital.

The human body is predominantly water, which resides in the intravascular, intracellular, and interstitial (or third) spaces. Water movement between these spaces depends on many variables. This chapter focuses on the management of the intravascular space because it is the only space surgeons have direct access to, and managing the intravascular space is the only way to affect the other two fluid compartments.

This chapter also examines historical aspects of shock, fluids, and electrolytes—not just to note interesting facts or to pay tribute to deserving physicians, but also to try to understand how knowledge evolved over time. Doing so is vital to understanding past changes in management as well as to accept future changes. We are often awed at the discoveries of the past yet also astounded by how wrong we often were and why. Certainly, in turn, future surgeons will look back at our current body of knowledge and be amazed at how little we knew and how frequently we were wrong. A consequence of not studying the past is to repeat its errors.

After the historical highlights, this chapter discusses various fluids that are now used along with potential fluids under development. Finally, caring for perioperative patients is explored from a daily needs perspective.

History

History is disliked by those who are in a hurry to just learn the bottom line. Learning from the past, however, is essential to know which treatments have worked and which have not. Dogma must always be challenged and questioned. Were the current treatments based on science? Studying the history of shock is important for at least three reasons. First, physicians and physiologists have been fascinated with blood loss out of necessity. Second, we need to assess what experiments have or have not been done. Third, we need to know more, because our current understanding of shock is elementary.

Resuscitation

One of the earliest authenticated resuscitations in the medical literature is the “miraculous deliverance of Anne Green,” who was executed by hanging on December 14, 1650. Green was executed in the customary way by “being turned off a ladder to hang by the neck.” She hanged for half an hour, during which time some of her friends pulled “with all their weight upon her legs, sometimes lifting her up, and then pulling her down again with a sudden jerk, thereby the sooner to dispatch her out of her pain” ( Fig. 4.1 ). When everyone thought she was dead, the body was taken down, put in a coffin, and taken to the private house of Dr. William Petty, who, by the King’s orders, was allowed to perform autopsies on the bodies of all persons who had been executed.

Fig. 4.1, Miraculous deliverance of Anne Green, who was executed in 1650.

When the coffin was opened, Green was observed to take a breath, and a rattle was heard in her throat. Petty and his colleague, Thomas Willis, abandoned all thoughts of dissection and proceeded to revive their patient. They held her up in the coffin and then, by wrenching her teeth apart, poured hot cordial into her mouth, which caused her to cough. They rubbed and chafed her fingers, hands, arms, and feet; after a quarter of an hour of such effort, they put more cordial into her mouth. Then, after tickling her throat with a feather, she opened her eyes momentarily.

At that stage, they opened a vein and bled her of 5 ounces of blood. They continued administering the cordial and rubbing her arms and legs. Next, they applied compressing bandages to her arms and legs. Heating plasters were put to her chest, and another plaster was inserted as an enema “to give heat and warmth to her bowels.” They then placed Green in a warm bed with another woman to lie with her to keep her warm. After 12 hours, Green began to speak; 24 hours after her revival, she was answering questions freely. At 2 days, her memory was normal, apart from her recollection of her execution and the resuscitation.

Shock

Hemorrhagic shock has been extensively studied and written about for many centuries. Injuries, whether intentional or not, have occurred so frequently that much of the understanding of shock has been learned by surgeons taking care of the injured.

What is shock? The current widely accepted definition is inadequate perfusion of tissue. However, many subtleties lie behind this statement. Cells require nutrients, but which nutrients are essential is not currently well defined. Undoubtedly, the most critical nutrient is oxygen, but concentrating on oxygenation alone probably represents very elemental thinking. Blood is highly complex and carries countless nutrients, buffers, cells, antibodies, hormones, chemicals, electrolytes, and antitoxins. Even if we think in an elemental fashion and try to optimize the perfusion of tissue, the delivery side of the equation is affected by blood volume, anemia, and cardiac output (CO). Moreover, the use of nutrients is affected by infection and drugs. Vascular tone plays a role as well; for example, in neurogenic shock, the sympathetic tone is lost, and in sepsis, systemic vascular resistance decreases because of a broken homeostatic process or possibly because of evolutionary factors.

The term shock appears to have been first employed in 1743 in a translation of the French treatise of Henri Francois Le Dran regarding battlefield wounds. He used the term to designate the act of impact or collision, rather than the resulting functional and physiologic damage. However, the term can be found in the book Gunshot Wounds of the Extremities, published in 1815 by Guthrie, who used it to describe physiologic instability.

Humoral theories persisted until the late nineteenth century, but in 1830, Herman provided one of the first clear descriptions of intravenous (IV) fluid therapy. In response to a cholera epidemic, he attempted to rehydrate patients by injecting 6 ounces of water into the vein. In 1831, O’Shaughnessy also treated cholera patients by administering large volumes of salt solutions intravenously and published his results in Lancet. Those were the first documented attempts to replace and to maintain the extracellular internal environment or the intravascular volume. Note, however, that the treatment of cholera and dehydration is not the ideal treatment of hemorrhagic shock.

In 1872, Gross defined shock as “a manifestation of the rude unhinging of the machinery of life.” His definition, given its accuracy and descriptiveness, has been repeatedly quoted in the literature. Theories on the cause of shock persisted through the late nineteenth century; although it was unexplainable, it was often observed. George Washington Crile concluded that the lowering of the central venous pressure in the shock state in animal experiments was due to a failure of the autonomic nervous system. Surgeons witnessed a marked change in ideas about shock between 1888 and 1918. In the late 1880s, there were no all-encompassing theories, but most surgeons accepted the generalization that shock resulted from a malfunctioning of some part of the nervous system. Such a malfunctioning has now been shown to not be the main reason—but surgeons are still perplexed by the mechanisms of hemorrhagic shock, especially regarding the complete breakdown of the circulatory system that occurs in the later stages of shock.

In 1899, using contemporary advances with sphygmomanometers, Crile proposed that a profound decline in blood pressure (BP) could account for all symptoms of shock. He also helped alter the way physicians diagnosed shock and followed its course. Before Crile, most surgeons relied on respiration, pulse, or a declining mental status when evaluating the condition of patients. After Crile’s first books were published, many surgeons began measuring BP. In addition to changing how surgeons thought about shock, Crile was a part of the therapeutic revolution. His theories remained generally accepted for nearly two decades, predominantly in surgical circles. Crile’s work persuaded Harvey Cushing to measure BP in all operations, which in part led to the general acceptance of BP measurement in clinical medicine. Crile also concluded that shock was not a process of dying but rather a marshaling of the body’s defenses in patients struggling to live. He later deduced that the reduced volume of circulating blood, rather than the diminished BP, was the most critical factor in shock.

Crile’s theories evolved as he continued his experiments; in 1913, he proposed the kinetic system theory. He was interested in thyroid hormone and its response to wounds but realized that epinephrine was a key component of the response to shock. He relied on experiments by Walter B. Cannon, who found that epinephrine was released in response to pain or emotion, shifting blood from the intestines to the brain and extremities. Epinephrine release also stimulated the liver to convert glycogen to sugar for release into the circulation. Cannon argued that all the actions of epinephrine aided the animal in its effort to defend itself.

Crile incorporated Cannon’s study into his theory. He proposed that impulses from the brain after injury stimulated glands to secrete their hormones, which, in turn, effected sweeping changes throughout the body. Crile’s kinetic system included a complex interrelationship among the brain, heart, lungs, blood vessels, muscles, thyroid gland, and liver. He also noted that if the body received too much stress, the adrenal glands would run out of epinephrine, the liver of glycogen, the thyroid of its hormone, and the brain itself of energy, accounting for autonomic changes. Once the kinetic system ran out of energy, BP would fall, and the organism would go into shock.

Henderson recognized the importance of decreased venous return and its effect on cardiac output and arterial pressure. His work was aided by advances in techniques that allowed careful recording of the volume curves of the ventricles. Fat embolism also led to a shock-like state, but its possible contribution was questioned because study results were difficult to reproduce. The vasomotor center and its contributions in shock were heavily studied in the early 1900s. In 1914, Mann noted that unilaterally innervated vessels of the tongues of dogs, ears of rabbits, and paws of kittens appeared constricted during shock compared with contralaterally denervated vessels.

Battlefield experiences continued to intensify research on shock. During the World War I era, Cannon used clinical data from the war as well as data from animal experiments to examine the shock state carefully. He theorized that toxins and acidosis contributed to the previously described lowering of vascular tone. He and others then focused on acidosis and the role of alkali in preventing and prolonging shock. The adrenal gland and the effect of cortical extracts on adrenalectomized animals were of fascination during this period.

Then, in the 1930s, a unique set of experiments by Blalock determined that almost all acute injuries are associated with changes in fluid and electrolyte metabolism. Such changes were primarily the result of reductions in the effective circulating blood volume. Blalock showed that those reductions after injury could be the result of several mechanisms ( Box 4.1 ). He clearly showed that fluid loss in injured tissues was loss of extracellular fluid (ECF) that was unavailable to the intravascular space for maintaining circulation. The original concept of a “third space,” in which fluid is sequestered and therefore unavailable to the intravascular space, evolved from Blalock’s studies.

Box 4.1
Causes of shock according to Blalock in 1930.
Data from Blalock A. Principles of surgical care: Shock and other problems . St. Louis, MO: CV Mosby; 1940.

  • Hematogenic (oligemia)

  • Neurogenic (caused primarily by nervous influences)

  • Vasogenic (initially decreased vascular resistance and increased vascular capacity, as in sepsis)

  • Cardiogenic (failure of the heart as a pump as in cardiac tamponade or myocardial infarction)

  • Large volume loss (extracellular fluid, as occurs in patients with diarrhea, vomiting, and fistula drainage)

Carl John Wiggers first described the concept of “irreversible shock.” His 1950 textbook, Physiology of Shock, represented the attitudes toward shock at that time. In an exceptionally brilliant summation, Wiggers assembled the various signs and symptoms of shock from various authors in that textbook ( Fig. 4.2 ), along with his own findings.

Fig. 4.2, Wiggers’ description of symptom complex of shock.

His experiments used what is now known as the Wiggers prep. In his most common experiments, he used previously splenectomized dogs and cannulated the arterial system. He took advantage of an evolving technology that allowed him to measure the pressure within the arterial system, and he studied the effects of lowering BP through blood withdrawal. After withdrawing blood until the dogs’ BP reached an arbitrary set point (typically, 40 mm Hg), he noted that their BP soon rose spontaneously as fluid was recruited into the intravascular space.

To keep the dogs’ BP at 40 mm Hg, Wiggers had to continually withdraw additional blood. During this compensated phase of shock, the dogs could use their reserves to survive. Water was recruited from the intracellular compartment as well as from the extracellular space. The body tried to maintain the vascular flow necessary to survive. However, after a certain period, he found that to keep the dogs’ BP at the arbitrary set point of 40 mm Hg, he had to reinfuse shed blood; he termed this phase uncompensated, or irreversible, shock. Eventually, after a period of irreversible shock, the dogs died.

The ideal model is uncontrolled hemorrhage, but its main problem is that the volume of hemorrhage is, by the nature of the experiment, uncontrolled. Variability is highest in this model even though it is the most realistic. Computer-assisted pressure models that mimic the pressures during uncontrolled shock can be used to reduce the artificiality of the pressure-controlled model. Smith and colleagues developed a hybrid controlled-uncontrolled hemorrhage model in which a standardized grade V liver laceration is made in swine. The swine bleed to either a specified pressure or a fixed volume, and bleeding is then controlled with packing. This removes the variability classically associated with uncontrolled hemorrhage.

Fluids

How did the commonly used IV fluids, such as normal saline, enter medical practice? It is often taken for granted, given the vast body of knowledge in medicine, that they were adopted through a rigorous scientific process, but that was not the case.

Normal saline has a long track record and is extremely useful, but we now know that it also can be harmful. Hartog Jakob Hamburger, in his in vitro studies of red cell lysis in 1882, incorrectly suggested that 0.9% saline was the concentration of salt in human blood. He chose 0.9% saline because it has the same freezing point as human serum. This fluid is often referred to as physiologic or normal saline, but it is neither physiologic nor normal.

In 1831, O’Shaughnessy described his experience in the treatment of cholera:

Universal stagnation of the venous system, and rapid cessation of the arterialization of the blood, are the earliest, as well as the most characteristic effects. Hence the skin becomes blue—hence animal heat is no longer generated—hence the secretions are suspended; the arteries contain black blood, no carbonic acid is evolved from the lungs, and the returned air of expiration is cold as when it enters these organs.

O’Shaughnessy wrote those words at the age of 22, having just graduated from Edinburgh Medical School. He tested his new method of infusing IV fluids on a dog and observed no ill effects. Eventually, he reported that the aim of his method was to restore blood to its natural specific gravity and to restore its deficient saline matters. His experience with human cholera patients taught him that the practice of bloodletting, then highly common, was good for “diminishing the venous congestion” and that nitrous oxide (laughing gas) was not useful for oxygenation.

In 1832, Robert Lewins reported that he witnessed Thomas Latta injecting extraordinary quantities of saline into veins, with the immediate effects of “restoring the natural current in the veins and arteries, of improving the color of the blood, and [of] recovering the functions of the lungs.” Lewins described Latta’s saline solution as consisting of “two drachms of muriate, and two scruples of carbonate, of soda, to sixty ounces of water.” Later, however, Latta’s solution was found to equate to 134 mmol per liter of Na+, 118 mmol per liter of Cl−, and 16 mmol per liter of bicarbonate (HCO3−).

During the next 50 years, many reports cited various recipes used to treat cholera, but none resembled 0.9% saline. In 1883, Sydney Ringer reported on the influence exerted by the constituents of the blood on the contractions of the ventricle ( Fig. 4.3 ). Studying an isolated heart model from frogs, he used 0.75% saline and a blood mixture made from dried bullocks’ blood. In his attempts to identify which aspect of blood caused better results, he found that a “small quantity of white of egg completely obviates the changes occurring with saline solution.” He concluded that the benefit of “white of egg” was because of the albumin or the potassium chloride. To show what worked and what did not, he described endless experiments with alterations of multiple variables.

Fig. 4.3, Sydney Ringer, credited for the development of lactated Ringer solution.

However, Ringer later published another article stating that his previously reported findings could not be repeated; through careful study, he realized that the water used in his first article was actually not distilled water, as reported, but rather tap water from the New River Water Company. It turned out that his laboratory technician, who was paid to distill the water, took shortcuts and used tap water instead. Ringer analyzed the water and found that it contained many trace minerals ( Fig. 4.4 ). Through careful and diligent experimentation, he found that calcium bicarbonate or calcium chloride—in doses even smaller than in blood—restored good contractions of the frog ventricles. The third component that he found essential to good contractions was sodium bicarbonate. Thus, the three ingredients that he found essential were potassium, calcium, and bicarbonate. Ringer solution soon became ubiquitous in physiologic laboratory experiments.

Fig. 4.4, Sydney Ringer’s report of contents in water from the New River Water Company.

In the early twentieth century, fluid therapy by injection under the skin (hypodermoclysis) and infusion into the rectum (proctoclysis) became routine. Hartwell and Hoguet reported its use in intestinal obstruction in dogs, laying the foundation for saline therapy in human patients with intestinal obstruction.

As IV crystalloid solutions were developed, Ringer solution was modified, most notably by pediatrician Alexis Hartmann. In 1932, wanting to develop an alkalinizing solution to administer to his acidotic patients, Hartmann modified Ringer solution by adding sodium lactate. The result was lactated Ringer (LR) or Hartmann solution. He used sodium lactate (instead of sodium bicarbonate); the conversion of lactate into sodium bicarbonate was sufficiently slow to lessen the danger posed by sodium bicarbonate, which could rapidly shift patients from compensated acidosis to uncompensated alkalosis.

In 1924, Rudolph Matas, regarded as the originator of modern fluid treatment, introduced the concept of the continued IV drip but also warned of potential dangers of saline infusions. He stated, “Normal saline has continued to gain popularity but the problems with metabolic derangements have been repeatedly shown but seem to have fallen on deaf ears.” In healthy volunteers, modern-day experiments have shown that normal saline can cause abdominal discomfort and pain, nausea, drowsiness, and decreased mental capacity to perform complex tasks.

The point is that normal saline and LR solution were formulated for conditions other than the replacement of blood, and the reasons for their formulation are archaic. Such solutions have been useful for dehydration; when they are used in relatively small volumes (1–3 L/day), they are well tolerated and relatively harmless; they provide water, and the human body can tolerate the amounts of electrolytes they contain. Over the years, LR solution has attained widespread use for treatment of hemorrhagic shock. However, normal saline and LR solution pass freely across the vascular membrane and are poorly retained in the vascular space. After a few hours, only about 175 to 200 mL of a 1-L infusion remains in the intravascular space. In countries other than the United States, LR solution is often referred to as Hartmann solution, and normal saline is referred to as physiologic (sometimes even spelled fisiologic) solution. Given the scientific advances of the last 50 years, it is difficult to understand why resuscitation fluids have advanced so little.

Blood Transfusions

Concerned about the blood that injured patients lost, Crile began to experiment with blood transfusions. As he stated, “After many accidents, profuse hemorrhage often led to shock before the patient reached the hospital. Saline solutions, adrenalin, and precise surgical technique could substitute only up to a point for the lost blood.” At the turn of the nineteenth century, transfusions were seldom used. Their use waxed and waned in popularity because of transfusion reactions and difficulties in preventing clotting in donated blood. Through his experiments in dogs, Crile showed that blood was interchangeable: he transfused blood without blood group matching. Alexis Carrel was able to sew blood vessels together with his triangulation technique, using it to connect blood vessels from one person to another for the purpose of transfusions. However, Crile found Carrel’s technique too slow and cumbersome in humans, so he developed a short cannula to facilitate transfusions.

By the time World War II occurred, shock was recognized as the single most common cause of treatable morbidity and mortality. At the time of the Japanese attack on Pearl Harbor on December 7, 1941, no blood banks or effectual blood transfusion facilities were available. Most military locations had no stocks of dried pooled plasma. Although the wounded of that era were evacuated quickly to a hospital, the mortality rate was still high. IV fluids of any kind were essentially unavailable, except for a few liters of saline manufactured by means of a still in the operating room. IV fluid was usually administered by an old Salvesen flask and a reused rubber tube. Often, a severe febrile reaction resulted from the use of that tubing.

The first written documentation of resuscitation in World War II patients was 1 year after Pearl Harbor, in December 1942, in notes from the 77th Evacuation Hospital in North Africa. E. D. Churchill stated, “The wounded in action had for the most part either succumbed or recovered from any existing shock before we saw them. However, later cases came to us in shock, and some of the early cases were found to be in need of whole blood transfusion. There was plenty of reconstituted blood plasma available. However, some cases were in dire need of whole blood. We had no transfusion sets, although such are available in the United States: no sodium citrate; no sterile distilled water; and no blood donors.”

The initial decision to rely on plasma rather than on blood appears to have been based in part on the view held in the Office of the Surgeon General of the Army and in part on the opinion of the civilian investigators of the National Research Council. Those civilian investigators thought that, in shock, the blood was thick and the hematocrit level was high. On April 8, 1943, the Surgeon General stated that no blood would be sent to the combat zone. Seven months later, he again refused to send blood overseas for the following reasons: (1) his observation of overseas theaters had convinced him that plasma was adequate for resuscitation of wounded men; (2) from a logistics standpoint, it was impractical to make locally collected blood available farther forward than general hospitals in the combat zone; and (3) shipping space was too scarce. Vasoconstricting drugs such as epinephrine were condemned because they were thought to decrease blood flow and tissue perfusion as they dammed the blood in the arterial portion of the circulatory system.

During World War II, out of necessity, efforts to make blood transfusions available heightened and led to the institution of blood banking for transfusions. Better understanding of hypovolemia and inadequate circulation led to the use of plasma as a favored resuscitative solution, in addition to whole blood replacement. Thus, the treatment of traumatic shock greatly improved. The administration of whole blood was thought to be extremely effective, so it was widely used. Mixing whole blood with sodium citrate in a 6:1 ratio to bind the calcium in the blood, which prevented clotting, worked well.

However, no matter what solution was used—blood, colloids, or crystalloids—the blood volume seemed to increase by only a fraction of what was lost. In the Korean War era, it was recognized that more blood had to be infused for the blood volume lost to be adequately regained. The reason for the need for more blood was unclear, but it was thought to be due to hemolysis, pooling of blood in certain capillary beds, and loss of fluid into tissues. Considerable attention was given to elevating the feet of patients in shock.

Physiology of Shock

Bleeding

Research and experience have both taught us much about the physiologic responses to bleeding. The advanced trauma life support (ATLS) course defines four classes of shock ( Table 4.1 ). In general, that categorization has helped point out the physiologic responses to hemorrhagic shock, emphasizing the identification of blood loss and guiding treatment. Conceptually, shock occurs at three anatomic levels of the cardiovascular system ( Fig. 4.5 ). The first level is the heart, where cardiogenic abnormalities can be either extrinsic (tension pneumothorax, hemothorax, or cardiac tamponade) or intrinsic (myocardial infarction causing pump failure, cardiac contusion or laceration, or cardiac failure). The second level is the large and medium vessels, in which hemorrhage and loss of blood volume lead to shock. The last level is the small vessels, in which either neurologic dysfunction or sepsis leads to vasodilatation and maldistribution of the blood volume.

Table 4.1
ATLS classes of hemorrhagic shock.
CLASS I CLASS II CLASS III CLASS IV
Blood loss (%) 0–15 15–30 30–40 >40
Central nervous system Slightly anxious Mildly anxious Anxious or confused Confused or lethargic
Pulse (beats/min) <100 >100 >120 >140
Blood pressure Normal Normal Decreased Decreased
Pulse pressure Normal Decreased Decreased Decreased
Respiratory rate 14–20/min 20–30/min 30–40/min >35/min
Urine (mL/hr) >30 20–30 5–15 Negligible
Fluid Crystalloid Crystalloid Crystalloid + blood Crystalloid + blood
ATLS, Advanced trauma life support.
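
Purely as an illustration of how Table 4.1 can be read (and bearing in mind the caveats about these classes discussed below), the classification can be sketched as a simple lookup keyed on estimated blood loss; the dictionary and function names here are hypothetical and not part of ATLS.

```python
# Illustrative sketch only: the ATLS classes of Table 4.1 as a lookup keyed on
# estimated blood loss percentage. Names and structure are hypothetical; the
# clinical caveats discussed in the text still apply.

ATLS_CLASSES = {
    "I":   (0, 15),    # blood loss 0-15%
    "II":  (15, 30),   # blood loss 15-30%
    "III": (30, 40),   # blood loss 30-40%
    "IV":  (40, 100),  # blood loss >40%
}

def atls_class_from_blood_loss(blood_loss_pct: float) -> str:
    """Return the ATLS class whose blood-loss range contains the given percentage."""
    for label, (low, high) in ATLS_CLASSES.items():
        if low <= blood_loss_pct < high:
            return label
    if blood_loss_pct >= 40:
        return "IV"
    raise ValueError("blood loss percentage must be between 0 and 100")

print(atls_class_from_blood_loss(35))  # "III"
```

In practice, as the following paragraphs emphasize, the percentage of blood lost is rarely known and the physiologic signs are unreliable, so such a lookup is only a mnemonic for the table, not a diagnostic tool.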

Fig. 4.5, Types of shock.

The four classes of shock according to the ATLS course are problematic because they were not rigorously tested or proven and were admittedly arbitrarily generated. Patients often do not exhibit all of the physiologic changes described by this table, particularly those at the extremes of age. Because of the higher water composition of their bodies, children are able to compensate for large volumes of blood loss, often exhibiting only tachycardia until they reach a tipping point at which they can no longer compensate and then decline rapidly. Elderly patients show almost the opposite physiology; they are less equipped to compensate for blood loss and show signs of a higher class of shock at a lower volume of blood loss, owing to a reduced ability to mount cardiac compensation and to recruit fluid reserves.

The problem with the signs and symptoms classically taught in ATLS classes is that, in reality, the manifestations of shock can be confusing and difficult to assess, particularly in trauma patients. For example, changes in mental status can be caused by blood loss, traumatic brain injury (TBI), pain, or illicit drugs. The same dilemma applies for respiratory rate and skin changes. Are alterations in a patient’s respiratory rate or skin color caused by pneumothorax, rib fracture pain, or inhalation injury?

Although various methods have been developed for monitoring patients in shock, BP continues to be the most clinically useful measure. When caring for a patient in shock, goals of resuscitation need to be established, remembering that baseline BP and blood volume are extremely variable and often unknown when treatment is initiated. Although there is no single universally applicable endpoint of resuscitation, markers such as normalization of serum lactate, base deficit, and pH, together with hemorrhage control when applicable, can be considered along with the rest of the patient’s overall clinical status.

Clinical symptoms are relatively few in patients who are in class I shock, with the exception of anxiety. Is the anxiety after injury from blood loss, pain, trauma, or drugs? A heart rate higher than 100 beats/min has been used as a physical sign of bleeding, but evidence of its significance is minimal. Brasel and colleagues have shown that heart rate was neither sensitive nor specific in determining the need for emergent intervention, the need for packed red blood cell (PRBC) transfusions in the first 2 hours after an injury, or the severity of the injury. Heart rate was not altered by the presence of hypotension (systolic BP <90 mm Hg).

In patients who are in class II shock, we are taught that their heart rate is increased, but, as previously mentioned, this is a highly unreliable marker; pain and mere nervousness can also increase heart rate. The change in pulse pressure, the difference between systolic and diastolic pressure, is also difficult to identify because the baseline BP of patients is not always known. The change in pulse pressure is thought to be caused by an epinephrine response constricting vessels, resulting in higher diastolic pressures.

Not until patients are in class III shock does BP theoretically decrease. At this stage, patients have lost 30% to 40% of their blood volume; for an average man weighing 75 kg (165 pounds), this equates to approximately 2 L of blood loss ( Fig. 4.6 ). It is helpful to remember that a can of soda or beer is 355 mL; a six-pack is 2130 mL. Theoretically, if a patient is hypotensive from blood loss, they have lost the equivalent of a six-pack of blood. The first and most important step when a patient is in shock due to hemorrhage is to recognize that blood loss is the cause of the shock, to identify the source of bleeding, and to treat it. Resuscitation occurs simultaneously as needed.
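
A rough worked check of this arithmetic, assuming the common estimate of roughly 70 mL of blood per kilogram of body weight (an assumption, not a figure given in the text):

```python
# Rough worked check of the class III blood-loss arithmetic above.
# Assumes an estimated blood volume of ~70 mL/kg, a common approximation.

weight_kg = 75
estimated_blood_volume_ml = 70 * weight_kg      # ~5250 mL, i.e., about 5 L

class_iii_loss_ml = (0.30 * estimated_blood_volume_ml,
                     0.40 * estimated_blood_volume_ml)   # ~1575 to ~2100 mL

six_pack_ml = 6 * 355                           # 2130 mL, roughly the 40% loss
print(class_iii_loss_ml, six_pack_ml)           # (1575.0, 2100.0) 2130
```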

Fig. 4.6, Liters of blood lost for class III shock, or 40% of 5 L, according to the advanced trauma life support (ATLS).

Since ATLS is designed for physicians who are not surgeons, many subtleties around the physiology of a bleeding patient are missing. However, surgeons know that there are some nuances of the varied responses to injuries in both animals and humans. In the case of arterial hemorrhage, for example, animals do not necessarily manifest tachycardia as their first response when bleeding, but actually become bradycardic. It is speculated that this is a teleological mechanism as a bradycardic response reduces cardiac output and minimizes uncontrolled exsanguination. A bradycardic response to bleeding is not consistently shown in all animals, including humans. Some evidence shows that this response, termed relative bradycardia, does occur in humans. Relative bradycardia is defined as a heart rate less than 100 beats/min while simultaneously having a systolic BP below 90 mm Hg. When bleeding patients have relative bradycardia, their mortality rate is lower. Up to 44% of hypotensive patients who are not bleeding have relative bradycardia. However, this lower heart rate is only protective to a certain level as patients with a heart rate below 60 beats/min are usually moribund. Bleeding patients with a heart rate of 60 to 90 beats/min have the highest survival rate compared with patients with tachycardia (a heart rate of more than 90 beats/min).

The physiologic response to bleeding also subtly differs according to whether the source of bleeding is arterial or venous. Arterial bleeding is an obvious problem, but it often stops temporarily on its own; the human body has evolved to trap the blood loss in adventitial tissues, and the transected artery will spasm and thrombose. A lacerated artery can actually bleed more than a transected artery as the spasm of the lacerated artery can enlarge the hole in the vessel. Thrombosis of the artery sometimes does not occur in transected or lacerated vessels. Also, since the arterial system does not have valves, the recorded BP can drop early even before large-volume loss has occurred. In these patients with arterial bleeding, hypotension may occur soon, but because ischemia has not yet had a chance to occur, measurements of lactate or base deficit can yield normal results.

In contrast, venous bleeding is typically slower, allowing the human body time to compensate. This slower progression provides the time necessary for recruitment of water from the intracellular and interstitial spaces, so large volumes of blood can be lost before hypotension ensues. Because venous or capillary bleeding is slower and the body has a chance to compensate, tissue ischemia develops during the process, and there is time for lactate and base deficit results to become abnormal. Venous blood loss can be massive before hypotension occurs.

It is generally taught that the hematocrit or hemoglobin (Hgb) level is not reliable in predicting blood loss. This can be true in patients who have not been resuscitated, but in patients who have received crystalloids, a rapid drop in the hematocrit and hemoglobin levels can occur. Bruns and associates have shown that the hemoglobin level can be low within the first 30 minutes after patients arrive at trauma centers. Therefore, a high or normal hemoglobin level does not rule out significant bleeding, but a low hemoglobin level found this early generally reflects severe blood loss.

The lack of good indicators to distinguish which patients are bleeding has led many investigators to examine heart rate variability or complexity as a potential new vital sign. Many clinical studies have shown that heart rate variability or complexity is associated with poor outcome. Heart rate variability or complexity would have to be calculated using software, with a resulting index on which clinicians would have to rely. This information would not be available by merely examining patients. Another issue with heart rate variability or complexity is that the exact physiologic mechanism for its association with poor outcome has yet to be elucidated. This new vital sign may be programmable into currently used monitors, but its usefulness has yet to be confirmed.

Hypotension has been traditionally set, arbitrarily, at 90 mm Hg and below. But this level can be variable from patient to patient, especially depending on age. Eastridge and coworkers have suggested that hypotension be redefined as below 110 mm Hg. In 2008, Bruns and colleagues confirmed the concept, showing that a prehospital BP below 110 mm Hg was associated with a sharp increase in mortality and that 15% of patients with BP below 110 mm Hg would eventually die in the hospital. As a result, they recommended redefining prehospital trauma triage criteria. In older patients, normal vital signs may miss occult hypoperfusion as geriatric patients often have increased lactate and base deficit levels.

Shock Index

Since heart rate and systolic BP individually are not accurate at identifying hemorrhagic shock, and because the traditionally taught combination of tachycardia and decreased systolic BP does not always occur, the shock index (SI), which uses these two variables together, was developed. SI is defined as heart rate divided by systolic BP. It has been shown to be a better marker for assessing severity of shock than heart rate or BP alone. It has utility not only in trauma patients, who are often in hemorrhagic shock, but also in patients who are in shock from other causes, such as sepsis, obstetric hemorrhage, myocardial infarction, stroke, and other acute illnesses. In the trauma population, SI has been shown to be more useful than heart rate and BP alone, and it has also been shown to be of benefit specifically in the pediatric and geriatric populations. It has been correlated with the need for interventions such as blood transfusion and invasive procedures, including operations. SI is known as a hemodynamic stability indicator. However, SI does not consider the diastolic BP, and thus a modified SI (MSI) was created. MSI is defined as heart rate divided by mean arterial pressure. A high MSI indicates a low stroke volume and low systemic vascular resistance, a sign of hypodynamic circulation; a low MSI indicates a hyperdynamic state. MSI has been considered a better marker than SI for mortality rate prediction.

Although SI and MSI are better than heart rate and systolic BP alone, combining these indices with additional variables will undoubtedly be more useful. Studies have shown that more complex calculations with more variables are more useful than simpler ones. For example, taking into account age, mechanism of injury, Glasgow Coma Scale (GCS) score, lactate levels, hemoglobin levels, and other physiologic parameters results in statistically better prediction than any single vital sign. It is intuitive that the addition of variables would be more predictive of outcome. That is why the presence of an experienced surgeon is critical; in a few seconds, the astute clinician will quickly consider multiple variables, including gender, age, GCS score, mechanism of injury, and other parameters. Whereas SI and MSI are statistically more accurate than any single parameter, there is no substitute for the experienced clinician at the bedside. This may be the reason that SI and MSI have not been widely adopted.
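
A minimal sketch of the two calculations follows; the function names are ours, and the mean arterial pressure is estimated here as diastolic pressure plus one third of the pulse pressure, a standard approximation rather than anything specified in the text.

```python
# Minimal sketch of the shock index (SI) and modified shock index (MSI).
# Function names are illustrative; MAP is approximated as diastolic + (pulse pressure / 3).

def shock_index(heart_rate: float, systolic_bp: float) -> float:
    """SI = heart rate / systolic blood pressure."""
    return heart_rate / systolic_bp

def modified_shock_index(heart_rate: float, systolic_bp: float, diastolic_bp: float) -> float:
    """MSI = heart rate / mean arterial pressure."""
    mean_arterial_pressure = diastolic_bp + (systolic_bp - diastolic_bp) / 3
    return heart_rate / mean_arterial_pressure

# Example: heart rate 110 beats/min with BP 90/60 mm Hg.
print(round(shock_index(110, 90), 2))                 # 1.22
print(round(modified_shock_index(110, 90, 60), 2))    # 1.57
```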

Lactate and Base Deficit

Lactate has stood the test of time as an associated marker of injury and possibly ischemia. However, new data question the etiology and role of lactate. The emerging information is confusing; it suggests that we may not understand lactate for what it truly implies. Lactate has long been thought to be a byproduct of anaerobic metabolism and is routinely perceived to be an end waste product that is completely unfavorable. Physiologists are now questioning this paradigm and have found that lactate behaves more advantageously than not. An analogy would be that firefighters are associated with fires, but that does not mean that firefighters are bad nor does it mean that they caused the fires.

Research has shown that lactate increases in muscle and blood during exercise. It is at its highest level at or just after exhaustion, which led to the assumption that lactate was a waste product. We also know that lactic acid appears in response to muscle contraction and continues to be produced in the absence of oxygen. Furthermore, accumulated lactate disappears when an adequate supply of oxygen is present in tissues.

Recent evidence indicates that lactate is an active metabolite, capable of moving between cells, tissues, and organs where it may be oxidized as fuel or reconverted to form pyruvate or glucose. It now appears that increased lactate production and concentration, as a result of anoxia or dysoxia, are often the exception rather than the rule. Lactate seems to be a shuttle for energy; the lactate shuttle is now the subject of much debate. The end product of glycolysis is pyruvic acid. Lack of oxygen is thought to convert pyruvate into lactate. However, lactate formation may allow carbohydrate metabolism to continue through glycolysis. It is postulated that lactate is transferred from its site of production in the cytosol to neighboring cells and to a variety of organs (e.g., heart, liver, and kidney), where its oxidation and continued metabolism can occur.

Lactate is also being studied as a pseudohormone as it seems to regulate the cellular redox state through exchange and conversion into pyruvate and through its effects on the ratio of nicotinamide adenine dinucleotide to nicotinamide adenine dinucleotide (reduced)—the NAD+/NADH ratio. It is released into the systemic circulation and taken up by distal tissues and organs, where it also affects the redox state in those cells. Further evidence has shown that it affects wound regeneration with promotion of increased collagen deposition and neovascularization. Lactate may also induce vasodilatation and catecholamine release and stimulate fat and carbohydrate oxidation.

Lactate levels in blood are highly dependent on the equilibrium between production and elimination from the bloodstream. The liver is predominantly responsible for clearing lactate, and liver disease affects lactate levels. Lactate was always thought to be produced from anaerobic tissues, but it now seems that a variety of tissue beds that are not undergoing anaerobic metabolism produce lactate when they are signaled with distress.

In canine muscle, lactate is produced by moderate-intensity exercise when the oxygen supply is ample. A high adrenergic stimulus also causes a rise in lactate as the body prepares for or responds to stress. A study of climbers of Mount Everest showed that the resting Po2 on the summit was about 28 mm Hg and decreased even more during exercise. The blood lactate level in those climbers was essentially the same as at sea level even though they were in a state of hypoxia. These facts lead us to question what we believed we knew about lactate and its true role.

In humans, lactate may be the preferred fuel in the brain and heart; in these tissues, infused lactate is used before glucose at rest and during exercise. Since lactate is glucose sparing, it allows glucose and glycogen levels to be maintained. In addition to lactate’s being preferred in the brain, evidence seems to indicate that lactate is protective to brain tissues in TBI and acts as fuel during exercise for the brain. The level of lactate, whether it is a waste product or a source of energy, seems to signify tissue distress whether it is from anaerobic conditions or other factors. During times of stress, there is a release of epinephrine and other catecholamines, which also causes a release of lactate.

Base deficit, a measure of the number of millimoles of base required to correct the pH of a liter of whole blood to 7.4, seems to correlate well with lactate level, at least in the first 24 hours after a physiologic insult. Rutherford, in 1992, showed that a base deficit of 8 was associated with a 25% mortality rate in patients older than 55 years without a head injury or in patients younger than 55 years with a head injury. When base deficit remains elevated, most clinicians believe that it is an indication of ongoing shock.

One problem with base deficit is that it is commonly influenced by the chloride in various resuscitation fluids, resulting in a hyperchloremic nongap acidosis. In patients with renal failure, base deficit can also be a poor predictor of outcome; in the acute stage of renal failure, a base deficit of less than 6 mmol/L is associated with poor outcome. With the use of hypertonic saline (HTS), which has three to eight times the sodium chloride concentration of normal saline, the hyperchloremic acidosis has been shown to be relatively harmless. However, when HTS is used, base deficit should be interpreted with caution.

Compensatory Mechanisms

When shock occurs, blood flow is diverted from less critical to more critical tissues. The earliest compensatory mechanism in response to a decrease in intravascular volume is an increase in sympathetic activity. Such an increase is mediated by pressure receptors or baroreceptors in the aortic arch, atria, and carotid bodies. A decrease in pressure inhibits parasympathetic discharge while norepinephrine and epinephrine are liberated and cause adrenergic receptors in the myocardium and vascular smooth muscle to be activated. Heart rate, contractility, and peripheral vascular resistance are increased, resulting in increased BP. However, the various tissue beds are not affected equally; blood is shunted from less critical organs (e.g., skin, skeletal muscle, and splanchnic circulation) to more critical organs (e.g., brain, liver, and kidneys).

The juxtaglomerular apparatus in the kidney, in response to the vasoconstriction and decrease in blood flow, produces the enzyme renin, which leads to the generation of angiotensin I. The angiotensin-converting enzyme located on the endothelial cells of the pulmonary arteries converts angiotensin I to angiotensin II. In turn, angiotensin II stimulates an increased sympathetic drive at the level of the nerve terminal by releasing hormones from the adrenal medulla. In response, the adrenal medulla affects intravascular volume during shock by secreting catechol hormones—epinephrine, norepinephrine, and dopamine—which are all produced from phenylalanine and tyrosine. They are called catecholamines because they contain a catechol group derived from the amino acid tyrosine. The release of catecholamines is thought to be responsible for the elevated glucose level in hemorrhagic shock. Although the role of glucose elevation in hemorrhagic shock is not fully understood, it does not seem to affect outcome.

Cortisol, also released from the adrenal cortex, plays a major role in fluid equilibrium. In the adrenal cortex, the zona glomerulosa produces aldosterone in response to stimulation by angiotensin II. Aldosterone is a mineralocorticoid that modulates renal function by increasing recovery of sodium and excretion of potassium. Angiotensin II also causes the reabsorption of sodium through a direct action on the renal tubules. Sodium is the primary osmotic ion in the human body in the regulation of water balance, with the reabsorption of sodium leading to the reabsorption of water, which subsequently leads to intravascular volume expansion in response to shock. One problem is that the release of these hormones is not infinite, thus the supply can be exhausted in a state of ongoing shock.

This regulation of intravascular fluid status is further affected by the carotid baroreceptors and the atrial natriuretic peptides. Signals are sent to the supraoptic and paraventricular nuclei in the brain. Antidiuretic hormone (ADH) is released from the pituitary, causing retention of free water at the level of the kidney. Simultaneously, volume is recruited from the extravascular and cellular spaces. A shift of water occurs as hydrostatic pressures fall in the intravascular compartment. At the capillary level, hydrostatic pressures also are reduced because the precapillary sphincters are vasoconstricted more than the postcapillary sphincters.

Lethal Triad

The lethal triad of acidosis, hypothermia, and coagulopathy is common in resuscitated patients who are bleeding or in shock from various factors. Our basic understanding is that inadequate tissue perfusion results in acidosis caused by lactate production. In the shock state, the delivery of nutrients to the cells is thought to be inadequate, leading to a decrease in the body’s main energy storage molecule, adenosine triphosphate (ATP). The human body relies on ATP production to maintain homeostatic temperatures, like all homeothermic (warm-blooded) animals do. Thus, if ATP production is inadequate to maintain body temperature, the body will trend toward the ambient temperature. For most human patients, this is 22°C (72°F), the temperature inside typical hospitals. The resulting hypothermia and acidosis then affect the efficiency of enzymes, which work best at 37°C and a pH of 7.4. For surgeons, the critical issue with hypothermia is the coagulation cascade dependence on enzymes that are affected by hypothermia. If enzymes are not functioning optimally due to hypothermia, coagulopathy worsens, which can contribute to uncontrolled bleeding from injuries or the surgery itself. Further bleeding continues to fuel the triad. The optimal method to break the “vicious circle of death” is to stop the bleeding and the causes of hypothermia. In most typical scenarios, hypothermia is not spontaneous from ischemia but is induced because of use of room temperature fluid or cold blood products.

Acidosis

Bleeding causes a host of responses. During the resuscitative phase, the lethal triad (acidosis, hypothermia, and coagulopathy) is frequent in severely bleeding patients, most likely because of two major factors. First, decreased perfusion causes lactic acidosis and consumptive coagulopathy. Second, room-temperature, large-volume fluids lead to worsening hypothermia and dilutional coagulopathy, creating a resuscitation injury. Some believe that the acidotic state is not necessarily undesirable because the body tolerates acidosis better than alkalosis. Oxygen is more easily offloaded from hemoglobin in the acidotic environment. Basic scientists who try to preserve tissue ex vivo find that cells live longer in an acidotic environment. Correcting acidosis with sodium bicarbonate has classically been avoided because it treats a laboratory value or symptom; the focus should be on correcting the cause of the acidosis. Treating the pH alone has shown no benefit, but it can lead to complacency. It is also argued that rapidly injecting sodium bicarbonate can worsen intracellular acidosis from the diffusion of the converted CO2 into the cells.

The best fundamental approach to metabolic acidosis from shock is to treat the underlying cause of shock. In the surgeon’s case, this is due to blood loss or ischemic tissue. However, some clinicians believe that treating the pH has advantages because the enzymes necessary for clotting function better at an optimal temperature and optimal pH. Coagulopathy can contribute to uncontrolled bleeding, so some have recommended treating acidosis with bicarbonate infusion for patients in dire scenarios. Treating acidosis with sodium bicarbonate may have a benefit in an unintended and unrecognized way. Rapid infusion of bicarbonate is usually accompanied by a rise in BP in hypotensive patients. This rise is usually attributed to correcting the pH; however, sodium bicarbonate in most urgent scenarios is given in ampules. The 50-mL ampule of sodium bicarbonate has 1 mEq/mL—in essence, similar to giving a hypertonic concentration of sodium, which quickly draws fluid into the vascular space. Given its high sodium concentration, a 50-mL bolus of sodium bicarbonate has physiologic results similar to 325 mL of normal saline or 385 mL of LR solution. Essentially, it is like giving small doses of HTS. Sodium bicarbonate quickly increases CO2 levels by its conversion in the liver, so if the minute ventilation is not increased, respiratory acidosis can result.
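
The sodium-load comparison can be checked with simple arithmetic; the sodium contents assumed below (approximately 154 mEq/L for normal saline and 130 mEq/L for LR solution) are standard values, not figures stated in this chapter.

```python
# Checking the sodium-load comparison above.
# Assumed sodium contents: normal saline ~154 mEq/L, lactated Ringer ~130 mEq/L.

ampule_na_meq = 50 * 1.0                         # 50-mL ampule at 1 mEq/mL = 50 mEq of sodium

ns_equivalent_ml = ampule_na_meq / 154 * 1000    # volume of normal saline carrying the same sodium load
lr_equivalent_ml = ampule_na_meq / 130 * 1000    # volume of LR carrying the same sodium load

print(round(ns_equivalent_ml), round(lr_equivalent_ml))   # 325 385
```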

THAM (tromethamine; tris[hydroxymethyl] aminomethane) is a biologically inert amino alcohol of low toxicity that buffers CO2 and acids. It is sodium free and limits the generation of CO2 in the process of buffering. At 37°C, the pKa of THAM is 7.8, making it a more effective buffer than sodium bicarbonate in the physiologic range of blood pH. In vivo, THAM supplements the buffering capacity of the blood bicarbonate system by generating sodium bicarbonate and decreasing the partial pressure of CO2. It rapidly distributes to the extracellular space and slowly penetrates the intracellular space, except in the case of erythrocytes and hepatocytes, and it is excreted by the kidney. Unlike sodium bicarbonate, which requires an open system to eliminate CO2 to exert its buffering effect, THAM is effective in a closed or semiclosed system, and it maintains its buffering ability during hypothermia. THAM acetate (0.3 M, pH 8.6) is well tolerated, does not cause tissue or venous irritation, and is the only formulation available in the United States. THAM may induce respiratory depression and hypoglycemia, which may require ventilatory assistance and the administration of glucose.

The initial loading dose of THAM acetate (0.3 M) for the treatment of acidemia may be estimated as follows:


THAM (in milliliters of 0.3 M solution) = lean body weight (in kilograms) × base deficit (in millimoles per liter)

The maximal daily dose is 15 mmol/kilogram/day for an adult (3.5 L of a 0.3-M solution in a patient weighing 70 kg). It is indicated in the treatment of respiratory failure (acute respiratory distress syndrome [ARDS] and infant respiratory distress syndrome) and has been associated with the use of hypothermia and permissive hypercapnia (controlled hypoventilation). Other indications are diabetic and renal acidosis, salicylate and barbiturate intoxication, and increased intracranial pressure (ICP) associated with brain trauma. It is used in cardioplegic solutions and during liver transplantation. Despite these attributes, THAM has not been documented clinically to be more efficacious than sodium bicarbonate.
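
A worked example of the loading-dose formula and the daily maximum, using illustrative numbers (a 70-kg lean body weight and a base deficit of 8 mmol/L are assumptions for the example, not recommendations):

```python
# Worked example of the THAM (0.3 M) loading-dose estimate described above.
# The patient values are illustrative assumptions only.

def tham_loading_dose_ml(lean_body_weight_kg: float, base_deficit_mmol_per_l: float) -> float:
    """Dose (mL of 0.3 M THAM) = lean body weight (kg) x base deficit (mmol/L)."""
    return lean_body_weight_kg * base_deficit_mmol_per_l

dose_ml = tham_loading_dose_ml(70, 8)     # 560 mL of 0.3 M solution
dose_mmol = dose_ml * 0.3                 # 168 mmol of THAM

max_daily_mmol = 15 * 70                  # 15 mmol/kg/day x 70 kg = 1050 mmol (3.5 L of 0.3 M solution)
print(dose_ml, dose_mmol, dose_mmol <= max_daily_mmol)   # 560 168.0 True
```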

Hypothermia

Hypothermia can be both beneficial and detrimental. A fundamental knowledge of hypothermia is of vital importance in the care of surgical patients. The beneficial aspects of hypothermia are mainly a result of decreased metabolism. Injury sites are often iced, creating vasoconstriction and decreasing inflammation through decreased metabolism. This concept of cooling to slow metabolism is also the rationale behind using hypothermia to decrease ischemia during cardiac, transplant, pediatric, and neurologic surgery. Also, amputated extremities are iced before reimplantation. Cold water near-drowning victims have higher survival rates, thanks to preservation of the brain and other vital organs. The Advanced Life Support Task Force of the International Liaison Committee on Resuscitation now recommends cooling (to 32°C–34°C for 12–24 hours) of unconscious adults who have spontaneous circulation after out-of-hospital cardiac arrest caused by ventricular fibrillation. Induced hypothermia is vastly different from spontaneous hypothermia, which is typically from shock, inadequate tissue perfusion, or cold fluid infusion.

Medical or accidental hypothermia is vastly different from trauma-associated hypothermia ( Table 4.2 ). The survival rates after accidental hypothermia range from about 12% to 39%. The average temperature drop is to about 30°C (range, 13.7°C–35.0°C). The lowest recorded core temperature in a survivor of accidental hypothermia (13.7°C, or 56.7°F) was in an extreme skier in Norway; she was trapped under the ice and eventually made a full neurologic recovery.

Table 4.2
Classification of hypothermia.
Severity Trauma Accidental
Mild 36–34°C 35–32°C
Moderate 34–32°C 32–28°C
Severe <32°C (<90°F) <28°C (<82°F)

The data in patients with trauma-associated hypothermia differ. Their survival rate falls dramatically as core temperature drops, with mortality approaching 100% once the core temperature reaches 32°C at any point, whether in the emergency department, operating room, or intensive care unit (ICU). In trauma patients, hypothermia is due to shock and is thought to perpetuate uncontrolled bleeding because of the associated coagulopathy. Trauma patients with a postoperative core temperature below 35°C have a fourfold increase in death; below 33°C, a sevenfold increase. Hypothermic trauma patients tend to be more severely injured and older and to have greater blood loss requiring more transfusions.

Surprisingly, in a study using the National Trauma Data Bank, Shafi and colleagues showed that hypothermia and its associated poor outcome were not related to the state of shock. It was previously thought that a core temperature below 32°C was uniformly fatal in trauma patients, who have the additional insult of tissue injury and bleeding. However, a small number of trauma patients have now survived despite a recorded core temperature below 32°C. Beilman and coworkers demonstrated that hypothermia was associated with more severe injuries, bleeding, and a higher rate of multiple-organ dysfunction in the ICU, but not with death on multivariate analysis.

To understand hypothermia, we have to remember that humans are homeothermic (warm-blooded) animals, in contrast to poikilothermic (cold-blooded) animals such as snakes and fish. To maintain a body temperature of 37°C, our hypothalamus uses a variety of mechanisms to tightly control core body temperature. We use oxygen as the key ingredient or fuel to generate heat in the mitochondria in the form of ATP. When ATP production falls below a minimal threshold, one consequence is that body temperature drifts toward the ambient temperature, which is typically lower than core body temperature. In contrast, during exercise, we use more oxygen as more ATP is required, and we produce excess heat. In an attempt to modulate core temperature, we start perspiring to use the cooling properties of evaporation.

Hypothermia, although potentially beneficial, is detrimental in trauma patients mainly because it causes coagulopathy. Cold affects the coagulation cascade by decreasing enzyme activity, enhancing fibrinolytic activity, and causing platelet dysfunction. Platelets are affected by the inhibition of thromboxane B2 production, resulting in decreased aggregation. A heparin-like substance is released, causing a disseminated intravascular coagulation–like syndrome. Hageman factor and thromboplastin are among the enzymes most affected. Even a drop in core temperature of just a few degrees results in 40% inefficiency in some of these enzymes.

Temperature affects the coagulation cascade so much that when blood is drawn from a cold patient and sent to the laboratory, the sample is warmed to 37°C before testing, because even 1 or 2 degrees of cold delays clotting and renders test results inaccurate. Thus, in a cold and coagulopathic patient, an abnormal coagulation profile from the laboratory represents the degree of coagulopathy that would exist if the patient (and not just the sample) had been warmed to 37°C. Therefore, a cold patient is always more coagulopathic than indicated by the coagulation profile, and a normal coagulation profile does not necessarily represent what is going on in the body.

Heat is measured in calories. One calorie is the amount of energy required to raise the temperature of 1 mL of water (which has, by definition, a specific heat of 1.0) by 1°C. It takes 1 kcal to raise the temperature of 1 L of water by 1°C. If an average man (weight, 75 kg) consisted of pure water, it would take 75 kcal to raise his temperature by 1°C. However, we are not made of pure water; blood has a specific heat coefficient of 0.87, and the human body as a whole has a specific heat coefficient of 0.83. Therefore, it actually takes 62.25 kcal (75 kg × 0.83) to raise body temperature by 1°C, and a patient who loses 62.25 kcal drops in core temperature by 1°C. This basic science is important in choosing methods to retain heat or to treat hypothermia or hyperthermia, and it allows one to compare the efficacy of one method with another.

The normal basal metabolic heat generation is about 70 kcal/hr. Shivering can increase this to 250 kcal/hr. Heat is transferred to and from the body by contact or conduction (as in a frying pan and Jacuzzi), air or convection (as in an oven and sauna), radiation, and evaporation. Convection is an extremely inefficient way to transfer heat as the air molecules are so far apart compared with liquids and solids. Conduction and radiation are the most efficient ways to transfer heat. However, heating the patient with radiation is fraught with inconsistencies and technical challenges, and thus it is difficult to apply clinically, so we are left with conduction to transfer energy efficiently.

Warming or cooling through manipulation of the temperature of IV fluids is useful because it uses conduction to transfer heat. Although IV fluids can be warmed, the U.S. Food and Drug Administration (FDA) allows fluid warmers to be set at a maximum of 40°C. Therefore, the differential between a cold trauma patient (34°C) and the warmed fluid is only 6°C, so 1 L of warmed fluid can transfer only 6 kcal to the patient. As previously calculated, about 62 kcal are needed to raise the core temperature by 1°C; therefore, about 10.4 L of warmed fluid would be required to raise the core temperature from 34°C to 35°C. Once that has been achieved, the differential between the patient and the warmed fluid is only 5°C, so it takes about 12.5 L of warmed fluid to raise the patient from 35°C to 36°C. A cold patient at 32°C needs to be given 311 kcal (75 kg × 0.83 × 5°C) to be warmed to 37°C. Note that fluid must be given at the highest rate possible because, if the infusion rate is slow, the fluid cools toward room temperature in the IV line. To avoid IV-line cooling, devices that warm fluids up to the point of insertion into the body should be used.

Warming of patients by infusion of warmed fluids is difficult, but fluid warmers are still critically important; the main reason to warm fluids is to prevent patients from being cooled further. Cold fluids can cool patients quickly. The fluids that are typically infused are at either room temperature (22°C) or 4°C if the fluids were refrigerated. The internal temperature of a refrigerator is 4°C, and this is where PRBCs are stored. Therefore, it takes 5 L of 22°C fluid or 2 L of cold blood products to cool a patient by 1°C. Again, the main reason for using fluid warmers is not necessarily to warm patients but to prevent cooling them during resuscitation.
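
The arithmetic in the preceding paragraphs can be collected into one short calculation. The sketch below uses the same assumptions as the text (a 75-kg patient, whole-body specific heat of 0.83, fluid warmers capped at 40°C, room-temperature fluid at 22°C, refrigerated blood at 4°C) and simply divides the kilocalories needed per degree by the kilocalories each liter of fluid can transfer.

```python
weight_kg = 75
specific_heat = 0.83
kcal_per_degree = weight_kg * specific_heat   # ~62 kcal to change core temperature by 1 degree C

def liters_per_degree(fluid_temp_c, patient_temp_c):
    """Liters of fluid needed to move core temperature by 1 degree C,
    assuming each liter transfers (temperature differential) kcal."""
    return kcal_per_degree / abs(fluid_temp_c - patient_temp_c)

print(liters_per_degree(40, 34))   # ~10.4 L of 40 C fluid to warm a 34 C patient by 1 degree
print(liters_per_degree(40, 35))   # ~12.5 L to go from 35 C to 36 C
print(liters_per_degree(22, 37))   # ~4-5 L of room-temperature fluid cools the patient by 1 degree
print(liters_per_degree(4, 37))    # ~2 L of refrigerated blood cools the patient by 1 degree
```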

Rewarming techniques are classified as passive or active. Active warming is further classified as external or internal ( Table 4.3 ). Passive warming involves preventing heat loss. Examples include drying the patient to minimize evaporative cooling, giving warm fluids to prevent cooling, and covering the patient so that the ambient air temperature immediately around the patient is higher than the room temperature. Covering the patient's head helps reduce the considerable heat normally lost through the scalp; aluminum-lined head covers are preferred because they reflect this heat back. Warming the room also reduces the heat loss gradient, but the surgical staff is usually unable to work in a humidified room at 37°C. Preventing evaporative heat loss also includes closing an open body cavity, such as the chest or abdomen. The most important way to prevent heat loss is to treat hemorrhagic shock by controlling bleeding; once shock has been treated, metabolism will heat the patient from his or her core. This point cannot be overemphasized.

Table 4.3
Classification of warming techniques.
Passive Active External Active Internal
Dry the patient Bair hugger Warmed fluids
Warm fluids Heated warmers Heat ventilator
Warm blankets and sheets Lamps Cavity lavage, chest tube, abdomen, bladder
Provide head covers Radiant warmers Continuous arterial or venous rewarming
Warm the room Clinitron bed Full or partial bypass

Active warming is the act of transferring calories to the patient, either externally through the skin or internally. Skin and fat are highly efficient at preventing heat transfer; the same fat that insulates against heat loss also makes transferring heat in through the skin difficult. Active external warming is thus inefficient compared with internal warming because of this built-in insulation. The first and most important step in active rewarming is to remove any wet clothes or bedding and dry the patient before starting any active warming technique; without this step, the efficiency of every method drops dramatically. External warming with forced-air heating, such as with Bair Hugger temperature management therapy (Arizant Healthcare Inc., Eden Prairie, MN), is technically classified as active warming, but because air is an inefficient medium, few calories are actually transferred to the patient. Forced-air heating increases only the patient's ambient temperature, and it can actually cool the patient initially by increasing evaporative heat loss if the patient is wet from blood, fluids, clothes, or sweat. Warming the skin may feel good to the patient and the surgeon, but by tricking the thermoregulatory nerve input from the skin, it suppresses shivering, a highly efficient source of internal heat. Because forced-air heating relies on convection, the actual amount of active warming is estimated to be only 10 kcal/hr.

Active external warming is more efficiently performed by placing patients on heating pads, which use conduction to transfer heat. Beds are available that can warm patients faster, such as the Clinitron bed (Hill-Rom, Batesville, IN), which uses heated air-fluidized beads. Such beds are not practical in the operating room but are applicable in the ICU. Another option is water-circulating pads, which use heated water for countercurrent heat exchange; these can be placed under the patient during surgery and can be effective in minimizing mild hypothermia. The number of kilocalories transferred per hour depends on the extent of vasodilation or vasoconstriction of the blood vessels in the skin. This countercurrent heat exchange system can also be used to cool the patient if necessary.

The best method to warm patients is to deliver calories internally ( Table 4.4 ). Heating the air used by the ventilator is technically a form of internal active warming but is again inefficient because it transfers heat by convection. The surface area of the lungs is massive, but the energy is carried mainly by humidified water droplets, mostly by convection rather than conduction, so the amount of heat transferred through warmed humidified air is minimal compared with methods that use conduction. A more effective internal approach is lavage of warmed fluids into body cavities via nasogastric tubes, Foley catheters, chest tubes, or lavage of the peritoneal cavity. If gastric lavage is desired, one method is continuous lavage by infusion of warmed fluids through the sump port while the fluid is suctioned out of the main tube. Devices that warm the patient through the hand by conduction show much promise but are not yet readily available.

Table 4.4
Calories delivered by active warming.
Method Kcal/hr
Airway from vent 9
Overhead radiant warmers 17
Heating blankets 20
Convective warmers 15–26
Body cavity lavages 35
Continuous arteriovenous rewarming 92–140
Cardiopulmonary bypass 710

A method that can actively rewarm a patient and also assist in the treatment of shock is extracorporeal membrane oxygenation (ECMO). With ECMO, the patient's blood is pumped through an artificial lung and then back into the bloodstream, which can support either a failing pulmonary or cardiac system. Along with oxygenation in the artificial lung, the blood can also be warmed and then returned to the patient. Recent literature, including case reports and a cost-effectiveness analysis, also shows promise in using ECMO for rewarming after accidental hypothermia. Cardiopulmonary bypass can also be used, as it delivers heated blood at a rate of more than 5 L/min to every place in the body where there are capillaries. If full cardiopulmonary bypass is not available or not desired, alternatives include continuous venous or arterial rewarming. Venovenous rewarming can also be accomplished using the roller pump of a dialysis machine (which is often more available to the average surgeon). A prospective study showed arteriovenous rewarming to be highly effective; it can warm patients to 37°C in about 39 minutes, compared with an average warming time of 3.2 hours with standard techniques. Special Gentilello arterial warming catheters are inserted into the femoral artery, and a second line is inserted into the opposite femoral vein. The pressure from the artery produces flow, which is directed through a fluid warmer and back into the vein. This method depends highly on the patient's BP because flow is directly related to BP. There are also commercially available central venous catheters that directly heat the blood; a countercurrent exchange system heats the tip of the catheter with warmed fluids, and as blood passes over the warmed catheter, heat is directly transferred.

During recent decades, with the changes in resuscitation methods, the incidence of hypothermia has decreased. Dilutional coagulopathy also occurs less frequently as the volume of crystalloids has been minimized and particular attention has been paid to ensure that all resuscitation fluids and blood are warmed before infusion.

Coagulopathy

Coagulopathy in surgical patients is multifactorial. In addition to acidosis and hypothermia, the other usual cause of coagulopathy is a decrease in clotting factors. This decrease can result from consumption (the innate attempt to stop bleeding), dilution (infused fluids devoid of clotting factors), or genetic deficiency (e.g., hemophilia).

Coagulopathy often needs to be corrected. The most commonly used tests for coagulopathy are the prothrombin time, partial thromboplastin time, and international normalized ratio. However, these tests have been shown to be inaccurate in detecting coagulopathy in surgical patients. One major reason is that coagulopathy is a dynamic state that evolves through stages of hypocoagulability, hypercoagulability, and fibrinolysis; the traditional tests of blood clotting cannot detect this evolution because they depict the coagulation state only at a single point in time. Moreover, the traditional tests are performed at normal pH and temperature, so they do not account for the effects of hypothermia and acidosis on coagulation. They are also performed on plasma rather than whole blood and therefore cannot measure the interaction of coagulation factors and platelets.

More recently, thromboelastography (TEG) and rotational thromboelastometry have emerged as dynamic measures of coagulation that provide a more sensitive and accurate measure of the coagulation changes seen in surgical patients. TEG and thromboelastometry are based on similar principles of detecting clot strength, which is the final product of the coagulation cascade. They are also performed on whole blood, so they consider the functional interaction of coagulation factors and platelets.

TEG parameters include the reaction time (R-time), the α (alpha) angle, and the maximum amplitude (MA). The R-time reflects the latent time until fibrin formation begins; an increase in this time may result from decreased activity or deficiencies of coagulation factors, whereas a decrease in R-time reflects a hypercoagulable state. The steepness of the α-angle reflects the rate of fibrin formation and cross-linking, with a sharper angle indicating faster fibrin formation and a flatter angle indicating slower formation. MA is the measure of clot strength and reflects clot elasticity; its value measures the strength of the interaction among coagulation factors, fibrin, and platelets, and qualitative or quantitative defects in any of these will decrease the MA. TEG also measures the fibrinolytic arm of the coagulation cascade. The LY30 and LY60 indices measure the fibrinolysis rate by calculating the decrease in clot strength at 30 and 60 minutes, respectively. A large lysis index reflects rapid fibrinolysis and may help guide the use of antifibrinolytic therapy in these patients, which has been shown to reduce mortality when given within 3 hours of injury. Components of a TEG can help guide treatment of a bleeding patient because they tell the surgeon which part of the clotting cascade is defective. These tests are routinely used in cardiac surgery and are becoming more popular in trauma and liver transplant surgery in the form of point-of-care testing, but they are not widely available in most hospitals ( Fig. 4.7 ).
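
As a minimal sketch of how a lysis index is derived, assuming LY30 is expressed as the percentage fall in clot amplitude 30 minutes after MA is reached, the calculation below uses arbitrary illustrative amplitude values, not numbers from the text.

```python
def lysis_index_percent(ma_mm, later_amplitude_mm):
    """Percent decrease in clot strength relative to maximum amplitude (MA)."""
    return 100.0 * (ma_mm - later_amplitude_mm) / ma_mm

ma = 60.0    # maximum amplitude, mm
a30 = 55.0   # amplitude 30 minutes after MA, mm
print(round(lysis_index_percent(ma, a30), 1))   # LY30 of ~8.3%
```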

Fig. 4.7, Coagulation and fibrinolysis testing.

The methods to define and to treat coagulopathy are still varied. As discussed earlier, stopping the lethal triad is the most important step in interrupting the vicious cycle of hemorrhage. Prothrombin complex concentrate (PCC) has become popular for the treatment of surgical coagulopathy. PCC contains several factors (II, VII, IX, and X), including variable amounts of factor VIIa depending on the brand used. For patients taking warfarin, PCC is the treatment of choice because it replaces the factors depleted by warfarin. This is of particular benefit in elderly patients with TBI, in whom fresh frozen plasma (FFP) can be a problem: if the patient has comorbid cardiac disease, the volume load can induce congestive heart failure. An additional benefit of PCC is that the time to reversal of coagulopathy is shorter than with FFP. The use of blood-based component therapy is paramount in treating coagulopathy (see later, “Evolution of Modern Resuscitation”). PCC has been studied in trauma patients at risk for bleeding with promising results, but the studies are not randomized and require further confirmation. If there were a drug that would stop or reduce bleeding and treat coagulopathy at a low cost without causing serious complications, it would be a landmark contribution to medicine. Again, the problem is that current candidates are expensive, and the adverse events from administering such drugs are still not fully elucidated.

Another target of the coagulation cascade for medications is modulation of the fibrinolytic pathways. Tranexamic acid (TXA) is a synthetic analogue of the amino acid lysine and an antifibrinolytic medication that competitively inhibits the activation of plasminogen to plasmin. It prevents degradation of fibrin, the protein that forms the framework of blood clots. TXA has about eight times the antifibrinolytic activity of an older analogue, ε-aminocaproic acid. It is used to treat or to prevent excessive blood loss during cardiac, liver, vascular, and orthopedic surgical procedures. Topical TXA appears to be effective and safe after total knee and hip replacement surgery, as well as for mucosal oropharyngeal bleeding in thrombocytopenic patients, reducing bleeding and the need for blood transfusions. Studies have shown similar results in children undergoing craniofacial surgery, spinal surgery, and other procedures. It is even used for heavy menstrual bleeding in oral tablet form and in dentistry as a 5% mouthwash. It has been advocated for use in trauma and seems to be effective in reducing rebleeding in spontaneous intracranial hemorrhage. A small double-blinded, placebo-controlled, randomized study of 238 patients showed a reduction in progression of intracranial bleeding after trauma, but the difference was not statistically significant because of the small sample size. TXA is used to treat primary fibrinolysis, which is integral to the pathogenesis of the acute coagulopathy of trauma.

The Clinical Randomization of an Antifibrinolytic in Significant Hemorrhage (CRASH-2) trial, a multicenter randomized controlled civilian trial of 20,211 patients, showed that TXA reduced all-cause mortality versus placebo (14.5% vs. 16.0%). The risk of death caused by bleeding was also reduced (4.9% vs. 5.7%). CRASH-2 also suggested that TXA was less effective and could even be harmful if treatment was delayed more than 3 hours after admission. These findings were confirmed in the retrospective Military Application of Tranexamic Acid in Trauma Emergency Resuscitation (MATTERs) study and were rapidly incorporated into military practice guidelines and subsequently into civilian practice worldwide. The PED-TRAX study demonstrated that among children treated at a military hospital in Afghanistan, TXA administration (given to 66 of 766 children) was independently associated with decreased mortality and improved neurologic and pulmonary outcomes. Although CRASH-2 was a randomized, placebo-controlled study, critics point out that it was performed in 270 hospitals in 40 countries and that with such a large sample size, small differences in outcome can reach statistical significance without necessarily being clinically relevant. The absolute risk reduction was approximately 1.5%, with an estimated number needed to treat of 68. The CRASH-3 trial is currently being conducted to assess the effect of TXA on the risk of death or disability in patients with TBI. The key will be dosing, timing, and patient selection. The drug is attractive because it is inexpensive ($5.70 per dose) and easy to use, with seemingly minimal side effects.
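
The quoted effect size can be checked directly from the reported mortality rates; rounding of the event rates explains the small difference from the published number needed to treat of 68.

```python
arr = 0.160 - 0.145        # absolute risk reduction: 16.0% placebo vs. 14.5% TXA all-cause mortality
nnt = 1 / arr              # number needed to treat
print(round(arr * 100, 1), round(nnt))   # 1.5 (%), ~67
```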

Oxygen Delivery

The definition of shock is inadequate tissue perfusion, but many clinicians have incorrectly simplified it to inadequate tissue oxygenation. Much of what we know about oxygen delivery and consumption started with a physiologist named Archibald V. Hill. He was an avid runner who measured the oxygen consumption of four runners running around an 88-m grass track ( Fig. 4.8 ). In the process of this work, Hill defined the terms maximum O2 intake, O2 requirement, and O2 debt. He is best known for his work with Otto Meyerhof on the distinction between aerobic and anaerobic metabolism, for which they shared the Nobel Prize in 1922.

Fig. 4.8, Bag with side tube, low on the left-hand side, for use while running. The tap is carried in the left hand.

Blood delivers oxygen via red blood cells, which contain hemoglobin. The simple calculation of oxygen delivery (DO2) is the cardiac output (CO) multiplied by the oxygen content of arterial blood (CaO2): DO2 = CO × CaO2

The average hemoglobin molecule carries 1.34 mL of oxygen per gram, depending on the arterial hemoglobin (Hgb) oxygen saturation (SaO2) of the red cell. In addition, a minor amount of oxygen is dissolved in plasma; this amount is calculated by multiplying the solubility constant 0.003 by the partial pressure of oxygen in arterial blood (PaO2). The CaO2 of arterial blood is calculated as follows: CaO2 = (1.34 × Hgb × SaO2) + (0.003 × PaO2)

where hemoglobin is in grams per deciliter. Cardiac output is heart rate multiplied by stroke volume. In the normal state, stroke volume can be increased by shunting blood from one tissue bed to the central vasculature, but most of the change in cardiac output is due to increased heart rate. In states of hemorrhage and resuscitation, stroke volume is affected by the infusion of fluids; as blood volume decreases, stroke volume ultimately falls, and the body compensates with an increase in heart rate.

Oxygen consumption (VO2) by cells is calculated by subtracting the oxygen content of venous blood (CvO2) from the delivered oxygen content of arterial blood (CaO2): VO2 = CO × (CaO2 − CvO2)

After simplifying the terms and converting the units, the result is as follows: VO2 = CO × 1.34 × Hgb × (SaO2 − SvO2)

The most conventional method of sampling the venous oxygen content is by drawing blood from the most distal port of a pulmonary artery catheter. The sample is taken from the pulmonary artery because venous blood is mixed there from all parts of the body. Oxygen content in the inferior vena cava is typically higher than in the superior vena cava, which is higher than in the coronary sinus. The average mixed venous sample is 75% saturated, so the oxygen consumption is thought to be, on average, 25% of the oxygen delivered ( Fig. 4.9 ). Thus, teleologically, there is ample reserve of oxygen delivered.
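
A short calculation ties these formulas together. The values below (cardiac output 5 L/min, hemoglobin 15 g/dL, SaO2 0.98, SvO2 0.75, PaO2 100 mm Hg) are illustrative normal numbers, not from the text; the factor of 10 converts oxygen content from milliliters per deciliter to milliliters per liter of blood.

```python
def cao2(hgb_g_dl, sao2, pao2_mm_hg):
    """Arterial oxygen content, mL O2 per dL of blood."""
    return 1.34 * hgb_g_dl * sao2 + 0.003 * pao2_mm_hg

def do2(co_l_min, hgb_g_dl, sao2, pao2_mm_hg):
    """Oxygen delivery, mL/min."""
    return co_l_min * cao2(hgb_g_dl, sao2, pao2_mm_hg) * 10

def vo2(co_l_min, hgb_g_dl, sao2, svo2):
    """Oxygen consumption, mL/min, using the simplified (SaO2 - SvO2) form."""
    return co_l_min * 1.34 * hgb_g_dl * (sao2 - svo2) * 10

print(round(do2(5.0, 15.0, 0.98, 100)))    # ~1000 mL/min delivered
print(round(vo2(5.0, 15.0, 0.98, 0.75)))   # ~230 mL/min consumed, roughly 25% of delivery
```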

Fig. 4.9, Oxygen delivery (D o 2 ) and consumption (V o 2 ) . During normal states, oxygen delivery is approximately 1000 mL/min of O 2 . The oxygen consumption in a normal state is 25% of delivery and is approximately 250 mL/min. At very low oxygen delivery, it is believed that consumption is delivery dependent and occurs in shock. There is oxygen debt during shock and during recovery, and there is a hyperdynamic stage during which the circulatory system is paying back its oxygen debt.

With advancements in technology, catheters are now available that can continuously measure the venous saturation in the pulmonary artery. These use pulse oximeter–like technology built into the tip of the pulmonary artery catheter, employing near-infrared (NIR) light waves to measure the oxygen saturation of hemoglobin. These advanced catheters can also provide cardiac output continuously. In the past, cardiac output was inferred by thermodilution: a standard volume of iced or room-temperature fluid was infused into the proximal port, and the rate of temperature change was measured at the distal aspect of the pulmonary artery catheter. In recent years, pulmonary artery catheters are no longer commonly used. Venous oxygenation can still be measured from central lines, but these samples are no longer truly mixed venous.

Cardiac output and oxygen delivery are also affected by the end-diastolic volume of the left ventricle. As described by Starling in 1915, cardiac output increases as the ventricular fibers lengthen, up to a maximum filling point beyond which cardiac output no longer increases ( Fig. 4.10 ). Left ventricular end-diastolic (LVED) volume can be inferred by using a pulmonary artery catheter to measure the wedge pressure, which represents preload; this reflects the pressure in the left ventricle because the vessels from the pulmonary artery to the left ventricle have no valves. Alternative approaches can help optimize the filling volume of the left ventricle. Pulmonary artery catheters that calculate right ventricular end-diastolic volume are available but are rarely used. Echocardiography using transthoracic or esophageal probes can directly estimate filling volumes in the heart, but variations in volume and heart size can distort the results, and heart size is also affected by medical conditions that stress and dilate the heart. The interpretation of heart size and adequacy-of-resuscitation data is thus subjective.

Fig. 4.10, Starling curve. As left ventricular end-diastolic (LVED) pressure is increased, the fibers of the heart muscle are lengthened, resulting in increased contraction and increased cardiac output. This occurs to a certain point, at which increases in volume and length do not result in increases in cardiac output.

Optimization (Supernormalization)

During the late 1980s, surgical critical care evolved into a specialty, focusing heavily on ventilator support and optimizing oxygen delivery to tissues. One of the pioneers of modern surgical critical care, William Shoemaker, theorized that during shock, because of a lack of oxygen delivery, there was anaerobic metabolism and an oxygen debt that needed to be repaid. He showed that after volume loading, if oxygen delivery increased, consumption would also increase—until a certain point, when an additional increase in oxygen delivery did not result in increased consumption. This increased oxygen consumption was thought to be the process of paying back the oxygen debt that occurred during ischemia throughout the body. Patients in shock were found to have a hyperdynamic stage in which increased oxygen delivery resulted in increased consumption. The assumption was that increased consumption was replenishing the oxygen debt that the body had incurred.

Shoemaker popularized the concept of optimization or supernormalization of oxygen delivery, which means that oxygen delivery is maximized or increased until consumption no longer increases but instead levels off (flow independence). The optimization process involved administering a rapid bolus of fluid and confirming that it raised the wedge pressure. Because the response to fluid infusion is dynamic, the bolus had to be given over a short period, such as 20 minutes; if it took longer, changes in the vascular space, and specifically in the heart, might be due to variables other than the fluid given. Also, if the response was not measured immediately after infusion, the effect of the infusion degraded quickly as fluid moved out of the vascular space. Wedge pressure and cardiac output therefore had to be measured immediately before and within minutes after the bolus to determine whether it was effective. If cardiac output increased along with the wedge pressure, it was assumed that oxygen delivery had increased. By sampling the central venous oxygen content when measuring cardiac output, clinicians could determine whether oxygen consumption also increased. This process was repeated until a fluid bolus no longer increased cardiac output. The goal was to move oxygen delivery from the delivery-dependent portion of the curve to the portion that was not delivery dependent ( Fig. 4.9 ).

The preferred fluid during the optimization process was LR solution because it was inexpensive and thought to be innocuous. Once the Starling curve was optimized, that is, once LVED volume could no longer be increased with increases in wedge pressure (preload), the wedge pressure would be kept at that maximal level; further increases in wedge pressure without an increase in LVED volume meant that patients might suffer unnecessary pulmonary edema.

Once fluid infusion maximized cardiac output and oxygen delivery, an inotropic agent would be added to further push cardiac output to a higher level. The agent recommended at that time was dobutamine. The dose was increased, and its effect on cardiac output was documented. With each maneuver, oxygen consumption was measured and cardiac output “optimized” to meet the consumption demands. This optimizing process maximized oxygen delivery to ensure that all tissue beds were being fed adequately. Shoemaker’s earlier clinical trials had shown that patients resuscitated in this manner had a lower incidence of multiple-organ dysfunction syndrome (MODS) and death. During this optimizing era, ARDS and MODS were the leading causes of late death in trauma patients.

However, subsequent clinical studies failed to reproduce Shoemaker's success. Randomized prospective trials showed that optimization of oxygen delivery and consumption did not improve outcome. In general, patients who responded to the optimization process did well, but those whose oxygen delivery could not be augmented to a higher level did poorly. Thus, although the response to optimization was prognostic of outcome, the process itself did not seem to change outcome. One reason that the earlier studies succeeded may have been that the control patients were not adequately resuscitated; in the later trials, when control patients were adequately resuscitated, the optimization process did not improve outcome. In fact, the aggressive use of fluids to achieve supranormal oxygen delivery could cause increased multiple-organ failure, abdominal compartment syndrome, and increased mortality from excessive crystalloid infusion. Over time, the once widely used pulmonary artery catheter fell out of favor, and studies have shown that its discontinued use has not adversely affected outcome. Because of its invasive nature and concern that the data it provided were often misinterpreted, the pulmonary artery catheter has virtually disappeared from the modern surgical ICU, aside from its use in cardiac surgery patients.

Moreover, oxygen delivery in hyperdynamic patients could not be driven to a point at which consumption seemed to level off. One theory was that as the heart was being pushed with the supernormalization process, the heart’s metabolism increased such that it was the major organ seemingly consuming all of the excess oxygen being delivered. The harder the heart worked to deliver the oxygen, the more it had to use. Normal cardiac output for an average man is about 5 L/min, yet patients were often driven to a cardiac output of 15 L/min or more for days at a time.

The critics of the optimization process asserted that there is a point at which oxygen consumption is delivery dependent, but the coupling of consumption and delivery made it seem as though increased delivery was what increased consumption. Furthermore, optimization advocates neglected the fact that the body is usually already on the flat part of the oxygen consumption curve; oxygen delivery rarely falls to the critical level at which the body consumes all that is delivered. The result of the optimization process usually meant that patients were flooded with fluids. The hyperdynamic response and MODS may have resulted from the fluids used, which may have caused an inflammatory response at excessive volumes.

The concept of oxygen debt, introduced by the physiologist Archibald Hill almost 100 years ago, may have fundamental flaws. His original work on aerobic and anaerobic metabolism in just four runners has been propagated for a century. However, modern exercise physiology studies have shown that oxygen debt is repaid over a short period, not over days. In contrast, the optimization process showed oxygen debt lasting for long periods.

During massive hemorrhage, ischemia to some tissues is theoretically possible. However, in acute hemorrhage, when the BP falls to 40 mm Hg, cardiac output and thus oxygen delivery are typically reduced by only 50%. Before resuscitation with acellular fluids, the hemoglobin level does not fall significantly. In this state, oxygen delivery is cut by only half, and the body is designed to have plenty of reserves (cells consume only 25% of the delivered oxygen in the normal state). Whether any ongoing anaerobic metabolism is actually occurring is questionable, as the oxygen delivery has to fall to 25% of baseline to theoretically be anaerobic. When resuscitation takes place without blood to restore the intravascular volume, the hemoglobin level theoretically may fall by 50%, but cardiac output is generally restored to the original state. Again, oxygen delivery is only halved, with plenty of oxygen still being delivered to avoid ongoing anaerobic metabolism. It is difficult to reduce cardiac output and hemoglobin level to a level at which oxygen delivery is reduced by 75%, that is, to below the anaerobic threshold.

In hypovolemic shock states, it was thought that, even though global oxygen delivery may be adequate, regional hypoxia is ongoing. Different organs and tissue beds are not similar in their oxygen needs or consumption. Flow to the critical organs is usually preserved, whereas nonessential tissue beds are sacrificed in terms of oxygen delivery and may experience a hypoxic insult. Yet such patients are not actively moving, and their oxygen demand is minimal. Thus, the theory of oxygen debt is in question. In exercise states, even if there is oxygen debt, it is paid back quickly and does not take days.

According to past calculations, one of the most efficient ways to optimize oxygen delivery was to add hemoglobin. If the hemoglobin level increased from 8.0 to 10 g/dL by transfusing two units of blood, oxygen delivery would increase by 25%. Blood transfusions were part of the optimization process because they also increased wedge pressure and LVED volume and thus cardiac output, but it was rarely noted that transfusions placed patients on the flat part of the consumption curve ( Fig. 4.9 ).

Decades ago, it was also thought that an increased hematocrit would reduce flow in the capillaries, so clinicians had reservations about transfusing too much blood. Studies in the 1950s demonstrated better flow at the capillary level with diluted blood. However, the decrease in flow with higher viscosity was in the range of a few percentage points and did not compare with the 25% increase in oxygen delivery from transfusing a couple of units of PRBCs. By these calculations, blood transfusion would be the most efficient way of increasing oxygen delivery, if that were the goal.

Current exercise physiology studies have shown that professional athletes perform better when their hemoglobin levels are above normal. Athletes who blood dope, by undergoing autologous blood transfusions or by taking red cell production enhancers such as erythropoietin or testosterone, are now banned for illegal performance enhancement. Such athletes can have cardiac outputs of 20 to 50 L/min, and they do not seem to have problems with blood sludging despite higher-than-normal flow and blood viscosity. The argument against this analogy between athletes and injured patients is that injured patients have capillaries that are not vasodilated and are often plugged with white and red cells.

Global Perfusion Versus Regional Perfusion

Gaining the ability to measure BP was revolutionary. However, because the main functions of the vascular system are to deliver needed nutrients and to carry out excreted substances from the cells, clinicians constantly ask whether BP or flow is more important. During sepsis, systemic vascular resistance is low. A malfunction somewhere in the autoregulatory system is assumed.

A teleologic explanation, however, is possible. Lower systemic vascular resistance could be a way our body evolved so that cardiac output can be easily increased as afterload is reduced. Some shunting is believed to occur at the capillary level; however, an important question is, “Should BP be augmented with exogenous administration of pressor agents, normalizing BP at the expense of capillary flow?” High doses of pressor agents most likely worsen flow because lactate levels rise if the pressor dose is too high. That rise in lactate levels could be caused by a stress response as catecholamines are known to increase lactate levels, or it could also be caused by decreased flow at the capillary bed.

Purists would prefer to have lower pressure as long as flow is adequate, but some organs are somewhat sensitive to pressure. For example, the brain and kidneys are traditionally thought to be pressure dependent; however, when early experiments were done, it was difficult to isolate flow from pressure due to the interrelation of those two values. With the concept that flow might be more important than just pressure, technology developed to focus on measuring flow of nutrients rather than pressure.

During hemorrhage or hypovolemia, blood is redirected to organs such as the brain, liver, heart, and kidneys—at the expense of tissue beds such as the skin, muscle, and gut. Thus, the search ensued to find the consequences of this shunting process. The gastrointestinal (GI) tract became the focus of much of this research. Two main methods were developed, gastric tonometry and NIR technology.

Gastric tonometry measures the adequacy of blood flow in the GI tract through placement of a CO2-permeable, saline-filled balloon in the stomach of a patient after gastric acid suppression. The balloon is left in contact with the mucosa of the stomach for 30 minutes, allowing the CO2 of the gastric mucosa to diffuse into the balloon and equilibrate. The saline and gas are then withdrawn from the balloon, and the partial pressure of the CO2 is measured. That value, in conjunction with the arterial HCO3, is used in the Henderson-Hasselbalch equation to calculate the pH of the gastric mucosa and, by inference, to determine the adequacy of blood flow to the splanchnic circulation.
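
A minimal sketch of that calculation, using the standard Henderson-Hasselbalch constants (pK of 6.1 and a CO2 solubility coefficient of 0.03), is shown below; the bicarbonate and tonometer CO2 values are illustrative, not from the text.

```python
import math

def gastric_phi(arterial_hco3_mmol_l, tonometer_pco2_mm_hg):
    """Gastric intramucosal pH (pHi) inferred from tonometry."""
    return 6.1 + math.log10(arterial_hco3_mmol_l / (0.03 * tonometer_pco2_mm_hg))

print(round(gastric_phi(24, 45), 2))   # ~7.35; lower values imply splanchnic hypoperfusion
```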

The logistic difficulties of gastric tonometry are concerning. Data on its use have suggested that even though it can help predict survival, resuscitating patients to an improved value had no survival benefit. Most clinicians have now abandoned gastric tonometry. A multicenter trial showed that in patients with septic shock, gastric tonometry was predictive of outcome, but implementing this technology was no better than using the cardiac index as a resuscitation goal. Regional variables of organ dysfunction are thought to be better monitoring variables than global pressure-related hemodynamic variables. However, the data seem to consistently indicate that initial resuscitation of critically ill patients with shock does not require monitoring of regional variables. After stabilization, regional variables are, at best, merely predictors of outcome rather than goals that should be targeted.

The optimal device for monitoring the adequacy of resuscitation should be noninvasive, simple, cheap, and portable. NIR spectroscopy uses the NIR region of the electromagnetic spectrum from about 800 nm to 2500 nm. Typical applications are wide ranging: physics, astronomy, chemistry, pharmaceuticals, medical diagnostics, and food and agrochemical quality control. The main attraction of NIR is that light at those wavelengths can penetrate skin and bone. This is why your hand looks red when it is placed over a flashlight; the other visible light waves are absorbed or reflected, but red light and infrared light pass through skin and bone readily.

A common device using NIR technology that has become standard in medicine is the pulse oximeter. By using slightly different light waves and adding a third wavelength in the 800-nm region, similar devices yield correlations with variables such as the redox status of cytochrome aa3. When the oxygen supply is less than adequate, the rate of electron transport is reduced and oxidative phosphorylation decreases, leading ultimately to anaerobic metabolism. Optical devices that use NIR wavelengths can determine the redox potential of copper atoms on cytochrome aa3 and have been used to study intracellular oxidative processes noninvasively. Thus, with NIR technology, the metabolic rate of tissue can be assessed directly to determine whether it is being adequately perfused. Animal models of hemorrhagic shock have validated the potential use of NIR technology by showing changes in regional tissue beds ( Fig. 4.11 ). The superiority of NIR results over conventional measurements of shock has been shown in animal and human studies.

Fig. 4.11, Cytochrome aa3 measurements in rabbits during hemorrhagic shock. Shown are regional tissue beds and implied tissue oxygenation. Oxygenation at the mitochondrial level is preserved in kidney and liver compared with muscle and stomach.

To test the utility of this potentially ideal monitoring device, a multicenter prospective study was conducted to determine whether NIR technology could detect patients at risk of hemorrhagic shock and its sequelae. Performed in seven level I trauma centers, the study enrolled 383 patients who were in severe traumatic shock with hypotension and who required blood transfusions. A probe similar to a pulse oximeter was placed on the thenar muscle of the patient's hand, continuously gathering NIR values. The NIR probe was found to be as sensitive as base deficit in predicting death and MODS in hypotensive trauma patients. The receiver operating characteristic curves show that it may also be somewhat better than BP in predicting outcome. More importantly, the negative predictive value was 90% ( Fig. 4.12 ). The noninvasive and continuous NIR probe was able to demonstrate perfusion status. Note, however, that MODS developed in only 50 patients in that study, probably because the method of resuscitating trauma patients changed during this period, reducing MODS and death rates. The changes that took place are discussed later in this chapter but, in brief, were due to damage control resuscitation.

Fig. 4.12, Near-infrared (NIR) spectroscopy in 383 patients with traumatic hemorrhagic shock with hypotension who required blood transfusion. NIR measured tissue oxygenation levels in the thenar muscle noninvasively and was found to correlate well with arterial base deficit.

NIR technology may be able to show when a patient is in shock or even when a patient is doing well. Occult hypoperfusion can be detected or even ruled out reliably with NIR. In the trauma setting, a noninvasive method that can continuously detect trends in parameters such as regional oxygenation status, base deficit, or BP will surely find a role. Will this technology change how patients are treated? The debate now centers on this issue and raises some questions. Once a patient's hypoperfusion status has been determined, whether by BP, NIR technology, or some other device, what should we do with that information? Is it necessary to increase oxygen delivery to regional tissue beds that are inadequately oxygenated? Previous studies have shown that optimizing global oxygen delivery is not useful and that regional tissue monitoring with gastric tonometry has also failed to show benefit, so will NIR technology be helpful or harmful? An example of harm would be overresuscitating a patient to fix an abnormal value that may or may not be clinically relevant. The end point of resuscitation is constantly being debated. Because NIR results correlate well with base deficit, we may one day use NIR technology to infer the base deficit indirectly.

NIR technology has other promising uses in surgery, such as direct monitoring of flow and tissue oxygenation in high-risk patients (e.g., in those undergoing organ transplantation; for free flap perfusion; for classification of burn injuries; in intraoperative assessment of bowel ischemia; with compartment syndrome; or even subdural and epidural hematomas). Perhaps the most useful application will be in the ICU in septic shock patients at risk for multiple-organ failure.

Septic Shock

In 2001, Rivers and colleagues reported that among patients with severe sepsis or septic shock in a single urban emergency department, mortality was significantly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy than among those who were given standard therapy (30.5% vs. 46.5%). The premise was that usual care was neither aggressive nor timely. Early goal-directed therapy addressed this by calling for central venous catheterization to monitor central venous pressure and central venous oxygen saturation, which were used to guide the use of IV fluids, vasopressors, PRBC transfusions, and dobutamine to achieve prespecified physiologic targets. Based on this type of research, the Surviving Sepsis Campaign clinical guidelines were initially published in 2004 and subsequently updated in 2008, 2012, and 2016. The various methods of therapy were graded by a panel of international experts. A randomized prospective study has since shown that protocol-based care for early septic shock does not improve outcome. The newer study was not identical to the original Rivers study, as the survival rates were much higher, but this may simply indicate that usual care had already incorporated many of the principles of early goal-directed therapy, making the difference negligible. This study also found no significant benefit from the mandated use of central venous catheterization and central hemodynamic monitoring in all patients.

In the most recent update of the Surviving Sepsis Campaign clinical guidelines, published in 2016, there was a major transition in the way sepsis is defined. Previous iterations of the campaign used the SIRS criteria ( Table 4.5 ), under which a patient with two of the four criteria plus a source of infection was deemed to have sepsis. This definition was fraught with problems because any noninfectious process that activated the inflammatory cascade could produce a similar physiologic picture. Instead of using SIRS criteria, sepsis is now defined as an increase in a patient's sequential organ failure assessment (SOFA) score of 2 points or more from baseline ( Table 4.6 ). Because the SOFA score can be cumbersome to calculate at the bedside and requires laboratory test results, a simpler version was developed. The qSOFA ( Table 4.7 ) can be calculated at the bedside by identifying tachypnea, altered mental status, and hypotension. If a patient at risk for sepsis meets two of these criteria, further workup for an infectious source should be conducted.

Table 4.5
SIRS criteria.
Body Temperature >38°C or <36°C
Heart Rate >90 beats/min
Tachypnea Respiratory rate >20/min or PaCO2 <32 mm Hg
White Blood Cell Count >12,000/mm 3 , <4000/mm 3 , or >10% immature neutrophils
SIRS , Systemic inflammatory response syndrome.

Table 4.6
Sequential organ failure assessment (SOFA) score.
Score
System 0 1 2 3 4
Respiratory PaO2/FiO2, mm Hg ≥400 <400 <300 <200 with respiratory support <100 with respiratory support
Coagulation Platelets, ×10³/μL ≥150 <150 <100 <50 <20
Liver Bilirubin, mg/dL <1.2 1.2–1.9 2.0–5.9 6.0–11.9 >12.0
Cardiovascular MAP ≥70 mm Hg MAP <70 mm Hg Dopamine <5 or dobutamine (any dose) Dopamine 5.1–15 or epinephrine ≤0.1 or norepinephrine ≤0.1 Dopamine >15 or epinephrine >0.1 or norepinephrine >0.1
Central Nervous System Glasgow coma scale score 15 13–14 10–12 6–9 <6
Renal Creatinine, mg/dL <1.2 1.2–1.9 2.0–3.4 3.5–4.9 >5.0
Urine output, mL/day <500 <200
FiO2, Fraction of inspired oxygen; MAP, mean arterial pressure; PaO2, partial pressure of arterial oxygen.

Table 4.7
qSOFA.
Respiratory rate ≥22/min
Altered mental status
Systolic blood pressure ≤100 mm Hg
SOFA, Sequential organ failure assessment.
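
Because qSOFA is simply a count of bedside criteria, it can be expressed in a few lines. The sketch below follows Table 4.7; it is illustrative only and not a validated clinical tool.

```python
def qsofa(respiratory_rate, altered_mental_status, systolic_bp_mm_hg):
    """Return the qSOFA score (0-3): one point per criterion met."""
    score = 0
    if respiratory_rate >= 22:
        score += 1
    if altered_mental_status:
        score += 1
    if systolic_bp_mm_hg <= 100:
        score += 1
    return score

# A patient at risk for sepsis meeting two or more criteria warrants further
# workup for an infectious source.
print(qsofa(respiratory_rate=24, altered_mental_status=False, systolic_bp_mm_hg=95))   # 2
```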

Problems with Resuscitation

Lessons learned from the Korean War showed that resuscitation with blood and blood products was useful. Throughout that war, the concept prevailed that only a limited amount of salt and water should be given to patients after injury. By the time of the Vietnam War, volume resuscitation in excess of replacement of shed blood had become accepted practice. That practice may have been influenced by studies of hemorrhagic shock performed by Tom Shires. In his classic study, Shires used the Wiggers model and bled 30 dogs to a mean BP of 50 mm Hg for 90 minutes. He then infused LR solution (5% of body weight) followed by blood in 10 dogs, plasma (10 mL/kg) followed by blood in another 10 dogs, and shed blood alone in the remaining 10 dogs. The dogs that received LR solution had the best survival rate. Shires concluded that, although the replacement of lost blood with whole blood remains the primary treatment of shock, adjunctive replacement of the coexisting functional volume deficit in the interstitium with a balanced salt solution appears to be beneficial, and that resuscitation with LR solution should therefore be initiated while whole blood transfusions are being mobilized.

Soon the surgical community went from being judicious with crystalloid solutions to being aggressive with them. Surgeons returning from the Vietnam War advocated the use of crystalloids, a seemingly cheap and easy method of resuscitating patients, touting that it saved lives. However, what evolved from this method of resuscitation was the so-called Da Nang lung, also known as shock lung and eventually as ARDS. (The U.S. Navy had its field hospital in Da Nang, Vietnam.) The explanation for this new condition was that battlefield patients were now living long enough to develop ARDS because their lives were saved with aggressive resuscitation and better critical care, including a greater capability to treat renal failure.

However, that explanation had no supporting evidence. The killed-in-action rate (the proportion of wounded patients who died before reaching a facility that had a surgeon) had not changed for more than a century ( Table 4.8 ). The died-of-wounds rate (the proportion of wounded patients who died after reaching a facility that had a physician) had decreased during World War II, thanks to the use of antibiotics, but it was slightly higher during the Vietnam War. The perceived reason for the slightly higher died-of-wounds rate was that patients in Vietnam were transported to medical facilities much more quickly by helicopter. Transport times did indeed decrease from an average of 4 hours to 40 minutes, but if the sicker patients who would normally have died in the field were transported more quickly only to die in the medical facility, then the killed-in-action rate should have fallen, and it did not.

Table 4.8
Mortality rates.
Killed in Action (%) Died of Wounds (%)
Civil War 16.0 13.0
Russo-Japanese War 20.0 9.0
World War I 19.6 8.1
World War II 19.8 3.0
Korean War 19.5 2.4
Vietnam War 20.2 3.5

Moreover, the renal failure rate and the cause of renal failure did not significantly change between the Korean War and the Vietnam War. Another false argument was that the wounds seen during the Vietnam War were worse because of the enemy’s high-velocity AK-47 rifles. Actually, the rounds or bullets used by the AK-47 were similar to those used by the enemy in the Russo-Japanese War, World War I, and World War II. The 7.62-mm round used in the AK-47 rifle was invented by the Japanese in the 1890s.

In the early 1970s, the prehospital system in the United States started to evolve. Previously, ambulances were usually hearses driven by morticians. That is why the early ambulances had a station wagon configuration. As the career paths of emergency medical technicians and paramedics grew, they started resuscitation in the field and continued it to the hospital. In 1978, the first ATLS course was given. To prevent shock, the ATLS course recommended that all trauma patients have two large-bore IV lines placed and receive 2 L of LR solution. The actual recommendation in the ATLS text specifically states that patients in class III shock should receive 2 L of LR solution followed by blood products. However, clinicians learned that crystalloid solutions seemed innocuous and improved BP in hypotensive patients.

In the 1980s and early 1990s, aggressive resuscitation was taught and endorsed. The two large-bore IV lines started in the field were converted to larger IV lines through a wire-guided exchange system. Central venous lines were placed early for aggressive fluid resuscitation. In fact, some trauma centers routinely performed cut-downs on the saphenous vein at the ankle to place IV tubing directly into the vein and thereby maximize flow during resuscitation.

Technology soon caught up, and machines were built to rapidly infuse crystalloid solutions. The literature was filled with data showing that ischemia to tissues resulted in disturbances of all types. Optimization of oxygen delivery was the goal. As a result, massive volumes of crystalloids were infused into patients. Residents were encouraged to “pound” patients with fluids. If trauma patients did not develop ARDS, it was taught that the patients were not adequately resuscitated, but many clinical trials eventually showed that prehospital fluids did not improve outcome ( Table 4.9 ).

Table 4.9
Prehospital fluid studies in trauma patients.
Article Summary of Findings
Aprahamian C, Thompson BM, Towne JB, et al. The effect of a paramedic system on mortality of major open intraabdominal vascular trauma. J Trauma. 1983;23:687–690. Paramedic system; open intraabdominal vascular trauma.
Kaweski SM, Sise MJ, Virgilio RW. The effect of prehospital fluids on survival in trauma patients. J Trauma. 1990;30:1215–1218. Prehospital fluids; trauma patients.
Bickell WH, Wall MJ Jr, Pepe PE, et al. Immediate versus delayed fluid resuscitation for hypotensive patients with penetrating torso injuries. N Engl J Med. 1994;331:1105–1109. Presurgery fluids; hypotensive penetrating torso injuries.
Turner J, Nicholl J, Webber L, et al. A randomised controlled trial of prehospital intravenous fluid replacement therapy in serious trauma. Health Technol Assess. 2000;4:1–57. Prehospital; 1309 serious trauma patients.
Kwan I, Bunn F, Roberts I. Timing and volume of fluid administration for patients with bleeding following trauma. Cochrane Database Syst Rev. 2001;1:CD002245. Prehospital; bleeding trauma patients.
Dula DJ, Wood GC, Rejmer AR, et al. Use of prehospital fluids in hypotensive blunt trauma patients. Prehosp Emerg Care. 2002;6:417–420. Prehospital fluids; hypotensive blunt trauma patients.
Greaves I, Porter KM, Revell MP. Fluid resuscitation in pre-hospital trauma care: a consensus view. J R Coll Surg Edinb. 2002;47:451–457. Prehospital; a consensus view.
Dutton RP, Mackenzie CF, Scalea TM. Hypotensive resuscitation during active hemorrhage: impact on in-hospital mortality. J Trauma. 2002;52:1141–1146. Presurgery fluids; hypotensive active hemorrhage.

Bleeding

One of the most influential studies on hemorrhagic shock was performed by Ken Mattox, and in 1994, the results were reported by Bickell and coworkers. The aim of Mattox's study, a prospective clinical trial, was to determine whether withholding prehospital fluids affected outcomes in hypotensive patients after a penetrating torso injury. IV lines were started in patients with penetrating torso trauma and a systolic BP lower than 90 mm Hg. On alternating days, patients either received standard fluid therapy in the field or had fluids withheld until hemorrhage control was achieved. Withholding prehospital fluids conferred a statistically significant survival advantage, a revolutionary and counterintuitive finding that shocked surgeons.

That 1994 article popularized the concept of permissive hypotension, that is, allowing hypotension during uncontrolled hemorrhage. The fundamental rationale for permissive hypotension was that restoration of BP with fluids would increase bleeding from uncontrolled sources. In fact, Cannon in 1918 had stated that “inaccessible or uncontrolled sources of blood loss should not be treated with IV fluids until the time of surgical control.”

Animal studies have validated the idea of permissive hypotension. Burris and colleagues have shown that moderate resuscitation results in a better outcome compared with no resuscitation or aggressive resuscitation. In a swine model of uncontrolled hemorrhage, Sondeen showed that raising BP with either fluids or pressors could lead to increased bleeding. The theory was that increasing BP would dislodge the clot that had formed. The study also found that the pressure that would cause rebleeding was a mean arterial pressure of 64 ± 2 mm Hg, with a systolic pressure of 94 ± 3 mm Hg and diastolic pressure of 45 ± 2 mm Hg. Other animal studies have confirmed these concepts.
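The pressures Sondeen reported hang together arithmetically. As a rough check, the common bedside approximation of mean arterial pressure as diastolic pressure plus one-third of the pulse pressure gives a value close to the directly measured mean of 64 mm Hg; the small discrepancy is expected because this formula is only an estimate, not a measurement from the study:

\[
\text{MAP} \approx P_{\text{dia}} + \frac{P_{\text{sys}} - P_{\text{dia}}}{3} = 45 + \frac{94 - 45}{3} \approx 61\ \text{mm Hg}
\]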

The next question was whether continuing the strategy of permissive hypotension in the operating room would improve survival. Dutton and associates randomized one group of patients to a target systolic BP of higher than 100 mm Hg and another group to a target systolic BP of 70 mm Hg. Fluid therapy was titrated until definitive hemorrhage control was achieved. However, despite attempts to maintain a systolic BP of 70 mm Hg, the average systolic BP was 100 mm Hg in the low-pressure group and 114 mm Hg in the high-pressure group; patients' BP rose spontaneously. Titrating patients' BP to the low target was difficult, even with less use of fluids. The survival rate did not differ between the two groups.

The idea of permissive hypotension was slow to catch on, and arguments for anything other than aggressive resuscitation were dismissed. Critics continued to emphasize that the Mattox trial focused only on penetrating injuries and should not be extrapolated to blunt trauma. Clinicians feared that patients with blunt traumatic brain injuries would be harmed if BP was not normalized. However, Shafi and Gentilello examined the National Trauma Data Bank and found that hypotension was an independent risk factor for death but that it did not increase the mortality rate in patients with TBIs any more than in patients without TBIs. The risk of death quadrupled in patients with hypotension, in both the TBI group (odds ratio, 4.1; 95% confidence interval, 3.5–4.9) and the non-TBI group (odds ratio, 4.6; 95% confidence interval, 3.4–6.0). Furthermore, in 2006, Plurad and coworkers showed that emergency department hypotension was not an independent risk factor for acute renal dysfunction or failure.

Trauma Immunology and Inflammation

The 1990s witnessed an explosion of information regarding alterations of homeostasis and cellular physiochemistry during shock. The scientific investigations of Shires, Carrico, Baue, and countless others shed light on the basic mechanisms underlying resuscitation of patients in shock. The pathophysiologic process was identified as an aberrant inflammatory state in which the body's immune system damages the endothelium and ultimately the end organs. This inflammatory state leads to a spectrum of conditions, beginning with fluid sequestration and edema and progressing to acute lung injury, systemic inflammatory response syndrome, ARDS, and MODS. Such conditions were seen in every surgical ICU. Attention focused on biochemical perturbations and altered mediators as sites for possible intervention. The fundamental cause was thought to be ischemia and reperfusion, which, as shown in animal models, damaged the capillary endothelium and subsequently the end organs. It was generally accepted that reperfusion injury was mediated by activated neutrophils that released deleterious cytokines and free oxygen radicals. The animal models used to study these concepts were actually ischemia-reperfusion models in which the superior mesenteric artery supplying the intestines was clamped for a prolonged time before the clamp was removed. It was later recognized that this was not an appropriate model of hemorrhagic shock, because the pathophysiologic mechanisms of ischemia-reperfusion injury differ from those of resuscitation injury.

Death after traumatic injury was classically described as trimodal: some patients died within a short time after injury, some died in the hospital within a few hours, and many died late in the hospital course. However, a more recent study of trauma patients has shown that deaths occur in a pattern of logarithmic decay, as is typical of biologic processes; no grouping of deaths can be seen unless the data are deliberately lumped into immediate, early, and late categories. The apparent trimodal distribution arose only because all patients who died after 24 hours were labeled as late deaths.

According to the traditional (although now discredited) trimodal pattern, the patients who died first could best be helped by a better prehospital system and, more important, by injury prevention. For the second group of patients, better resuscitation and hemorrhage control were thought to be potentially lifesaving interventions. For the third group (the late deaths), immunomodulation was considered key, because the cause was thought to be an aberrant adaptive inflammatory response after successful resuscitation. When end-arteriolar flow ceases long enough to produce tissue ischemia and is then followed by reperfusion, the result is termed reperfusion injury. For example, after an injury to the femoral artery in which 4 to 6 hours pass before circulation is restored, muscle cells undergo ischemia and then reperfusion, begin to swell, and can produce compartment syndrome in the lower leg. This sequence of ischemia and reperfusion was thought to occur after a period of hypotension. However, it is now known that the pathophysiologic change is due to resuscitation injury rather than to reperfusion injury.

With improved technology, the immunologic response after trauma was heavily researched. In the past, we were limited to studying physiology. A theory started to evolve that shock caused an aberrant inflammatory response that then needed to be modulated and suppressed. Many studies during this era showed that the inflammatory system was upregulated or activated after shock. The white cells in the blood became activated. Neutrophils were identified as the key mediators in the acute phase of shock, whereas lymphocytes are typically key players in chronic diseases (e.g., cancer and viral infections). Shock, caused by various mechanisms, was thought to induce ischemia to tissues and, after reperfusion, to set off an inflammatory response, which primarily affected the microcirculation and caused leaks ( Fig. 4.13 ).

Fig. 4.13, Hemorrhage causing neutrophil activation.

Typically, neutrophils are rapidly transported through capillaries. However, when signaled by chemokines, neutrophils start to roll, firmly adhere to the endothelium, and migrate out of the capillaries to find the body's foes and initiate healing. Early researchers thought that neutrophils battled invaders (e.g., bacteria) through phagocytic activity and the release of oxygen free radicals; this was thought to be the reason for the leak in the capillary system ( Fig. 4.14 ). Because neutrophils can be primed to mount an enhanced response, a massive search took place to identify the causes of neutrophil priming and the means of downregulating it. The many mediators targeted included interleukins 1 through 18, tumor necrosis factor (TNF), and adhesion molecules, such as intercellular adhesion molecules, vascular cell adhesion molecules, E-selectin, L-selectin, and P-selectin, as well as platelet-activating factor.

Fig. 4.14, Activated intravascular neutrophils roll and adhere until another set of mechanisms causes firm adherence and transendothelial migration out of the vascular system. This transmigration process is believed to injure the endothelium through the release of oxygen free radicals, which can result in fluid leaking out of the vascular system. ICAM, Intercellular adhesion molecule; PECAM, platelet–endothelial cell adhesion molecule.

That research overlapped considerably with the research being performed in the arenas of reimplantation, vascular ischemia, and reperfusion. Clinically, it was already known that reimplantation of severed extremities produced pathophysiologic results similar to those of ischemia, reperfusion, and swelling caused by leaky capillaries. The immune response was described as bimodal: trauma or shock primed the response, which then became exaggerated when the patient was hit with a second insult (e.g., infection).

In the late 1990s, other researchers focused on the role of the alimentary tract. They knew that blood is shunted away from the splanchnic circulation by vasoconstriction during hemorrhagic shock, so the gut suffers the most ischemia during shock and is the most susceptible to reperfusion injury. The animal model most often used to study the gut's role in inflammation was a rat model of superior mesenteric artery occlusion and reperfusion. Because systemic inflammatory response syndrome is a sterile phenomenon, the gut was implicated as a potential player in the development of MODS. Animals were shown to have translocation of bacteria into the portal system, and this initiation of the inflammatory cascade was investigated as the source of MODS. Investigators also knew that Escherichia coli in the blood release endotoxin, which further triggers the release of cytokines (e.g., TNF, also known as cachectin). However, studies in humans failed to demonstrate translocation of bacteria in intraoperative samples of portal vein blood during resuscitation. The problem was that, although complete occlusion of the superior mesenteric artery for hours followed by reperfusion does result in swollen, necrotic, injured bowel, these findings were extrapolated to humans undergoing hemorrhagic shock. During hemorrhagic shock, however, the superior mesenteric artery is not occluded, and even in severe shock there is trickle flow of blood to the splanchnic organs.

Because patients in shock bleed and receive blood transfusions, transfusion of PRBCs was also implicated as a cause of MODS. Patients who required massive amounts of PRBCs were the most likely to develop MODS. Researchers found that the use of older PRBCs was an independent risk factor for the development of MODS. PRBCs have a shelf life of 42 days in the refrigerated state, and as blood ages, changes occur in the stored fluid that have been shown to affect the immune response negatively. However, randomized trials in cardiac and ICU patients have failed to identify worse outcomes in patients receiving older blood. The number of units transfused in these studies averaged 3 to 4; the age of blood has not been studied in a randomized fashion in patients receiving massive transfusions.

In the past, when technology was limited, PRBCs were mainly tested for the red cells' capability to carry oxygen and for their viability under the microscope and in the body. Most major trauma centers now use leukoreduced PRBCs; that is, the small number of white cells that can release oxygen free radicals and cytokines is routinely filtered out before the PRBCs are stored. Leukoreduction removes 99.9% of donor white cells and, in one large Canadian study, reduced the mortality rate from 7.03% to 6.19%. Other trauma studies have shown no reduction in the mortality rate but did show decreases in rates of infection, infectious complications, and late ARDS. To date, however, the largest study of leukoreduction in trauma patients has not shown any reduction in the rates of infection, organ failure, or mortality.
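For perspective, the mortality difference in that Canadian study corresponds to the following absolute risk reduction (ARR) and number needed to treat (NNT). This is a back-of-envelope calculation from the percentages quoted above, not a statistic reported by the investigators:

\[
\text{ARR} = 7.03\% - 6.19\% = 0.84\%, \qquad \text{NNT} = \frac{1}{0.0084} \approx 119
\]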

Numerous trials have examined the blockade of cytokines to treat septic patients. Two prospective, randomized, multicenter, double-blinded trials, the North American Sepsis Trial (NORASEPT) and the International Sepsis Trial (INTERSEPT), studied the 28-day mortality rate of critically ill patients who received anti-TNF antibody. Neither trial showed any benefit. Trials testing other potential targets were disappointing as well; the agents tested included agents directed against CD11/CD18, the interleukin-1 receptor, and endotoxin, as well as bradykinin antagonists and platelet-activating factor receptor antagonists. The search continues for one key mediator that could be manipulated to solve the “toxemia” of shock. However, such attempts to simplify the events and to find one solution may be the main problem, because there is no simple answer and no simple solution; the answer may lie in cocktails of substances. The humoral and endocrine systems, which are always mediated by blood, are exceedingly complex. Shock has many causes and mechanisms, and understanding this is crucial as we look for solutions.

Evolution of Modern Resuscitation

Detrimental Impact of Fluids

As early as 1996, the U.S. Navy used a swine model to study the effects of fluids on neutrophil activation after hemorrhagic shock and resuscitation. It was shown that neutrophils are activated after a 40% blood volume hemorrhage when followed by resuscitation with LR solution. That finding was not surprising. What was enlightening was that the level of neutrophil activation was similar in control animals that did not undergo hemorrhagic shock but merely received LR solution ( Fig. 4.15 ). In other control animals that did not receive LR solution but instead were resuscitated with shed blood or HTS after hemorrhagic shock, the neutrophils were not activated. The implication was that the inflammatory process was not caused by shock and resuscitation but by LR solution itself.

Fig. 4.15, Neutrophil activation in whole blood of swine measured by flow cytometry. The highest neutrophil activation followed hemorrhagic shock (Hem) and resuscitation (res) using lactated Ringer (LR) solution. Similar neutrophil activation occurred when the animal was not resuscitated but was infused with LR solution. No activation occurred when shocked animals were resuscitated with whole blood or 7.5% hypertonic saline (HTS) .

Those findings were repeated over several years in a series of experiments using human blood as well as in experiments in small and large animal models of hemorrhagic shock. When the blood was diluted with various resuscitation fluids, the inflammatory changes depended on the fluid used; despite similar physiologic results in vivo, the immunologic results were different ( Fig. 4.16 ). The response was ubiquitous throughout the entire inflammatory response system, including at the levels of deoxyribonucleic acid (DNA) and ribonucleic acid (RNA) expression.

Fig. 4.16, Human neutrophil activation using whole blood diluted with various resuscitation fluids, as measured by flow cytometry. Phosphate-buffered saline (PBS) was used because it has a pH of 7.4.

Ultimately, it was recognized that the inflammatory response was due to the resuscitation fluids themselves: the type and amount of fluid directly caused inflammation, and all the artificial fluids used to raise BP could produce the inflammatory sequelae of shock ( Table 4.10 ). What might be obvious today was not obvious then and went unrecognized for decades. It was not recognized that blood is extremely complex and that replacement or resuscitation with simple fluids other than blood had consequences. Blood does more than raise BP and carry red cells. In the past, we studied the complexity of the body's immune response but failed to realize that fluids such as LR solution and normal saline, developed more than 100 years ago, are not ideal substitutes for blood when used in massive quantities.

Table 4.10
Summary of studies by U.S. Navy demonstrating fluids causing inflammation after resuscitation.
Article Model Summary of Findings
Rhee P, Burris D, Kaufmann C, et al. Lactated Ringer’s solution resuscitation causes neutrophil activation after hemorrhagic shock. J Trauma. 1998;44:313–319. Swine LR causes neutrophil activation; blood and HTS do not.
Deb S, Martin B, Sun L, et al. Resuscitation with lactated Ringer’s solution in rats with hemorrhagic shock induces immediate apoptosis. J Trauma. 1999;46:582–588. Rats LR causes apoptosis in liver and gut more than HTS does.
Sun LL, Ruff P, Austin B, et al. Early up-regulation of intercellular adhesion molecule-1 and vascular cell adhesion molecule-1 expression in rats with hemorrhagic shock and resuscitation. Shock. 1999;11:416–422. Rats LR causes cytokine release more than HTS does.
Alam HB, Sun L, Ruff P, et al. E- and P-selectin expression depends on the resuscitation fluid used in hemorrhaged rats. J Surg Res. 2000;94:145–152. Rats LR causes increased E- and P-selectin expression more than HTS does.
Rhee P, Wang D, Ruff P, et al. Human neutrophil activation and increased adhesion by various resuscitation fluids. Crit Care Med. 2000;28:74–78. Human cells Artificial fluids cause neutrophil activation more than HTS and albumin do.
Deb S, Sun L, Martin B, et al. Lactated Ringer’s solution and hetastarch but not plasma resuscitation after rat hemorrhagic shock is associated with immediate lung apoptosis by the up-regulation of the Bax protein. J Trauma. 2000;49:47–53. Rats LR and hetastarch increase lung apoptosis compared with whole blood, plasma, and albumin.
Alam HB, Austin B, Koustova E, et al. Resuscitation-induced pulmonary apoptosis and intracellular adhesion molecule-1 expression in rats are attenuated by the use of ketone Ringer’s solution. J Am Coll Surg. 2001;193:255–263. Rats Substituting ketones for lactate reduces pulmonary apoptosis and release of intercellular adhesion molecules.
Koustova E, Stanton K, Gushchin V, et al. Effects of lactated Ringer’s solutions on human leukocytes. J Trauma. 2002;52:872–878. Human cells d-LR causes inflammation more than l-LR does.
Alam HB, Stegalkina S, Rhee P, et al. cDNA array analysis of gene expression following hemorrhagic shock and resuscitation in rats. Resuscitation. 2002;54:195–206. Rats Different fluids cause gene expression at different levels.
Koustova E, Rhee P, Hancock T, et al. Ketone and pyruvate Ringer’s solutions decrease pulmonary apoptosis in a rat model of severe hemorrhagic shock and resuscitation. Surgery. 2003;134:267–274. Rats Ketone and pyruvate Ringer solutions protect against apoptosis compared with LR.
Stanton K, Alam HB, Rhee P, et al. Human polymorphonuclear cell death after exposure to resuscitation fluids in vitro: apoptosis versus necrosis. J Trauma. 2003;54:1065–1074. Human cells Artificial fluids cause apoptosis and necrosis.
Gushchin V, Alam HB, Rhee P, et al. cDNA profiling in leukocytes exposed to hypertonic resuscitation fluids. J Am Coll Surg. 2003;197:426–432. Human cells LR causes more cytokine release by gene expression than HTS does.
Alam HB, Stanton K, Koustova E, et al. Effect of different resuscitation strategies on neutrophil activation in a swine model of hemorrhagic shock. Resuscitation. 2004;60:91–99. Swine Artificial fluids cause neutrophil activation regardless of resuscitation rate.
Jaskille A, Alam HB, Rhee P, et al. d-Lactate increases pulmonary apoptosis by restricting phosphorylation of bad and eNOS in a rat model of hemorrhagic shock. J Trauma. 2004;57:262–269. Rats d-Lactate in fluids causes more apoptosis than l-lactate does.
cDNA, Complementary deoxyribonucleic acid; HTS, hypertonic saline; LR, lactated Ringer solution.

Further investigations showed that when the lactate in LR solution was replaced with other energy sources that the mitochondria can use more readily, the inflammatory effects were attenuated. One such novel fluid was ketone Ringer solution ( Table 4.11 ). Lactic acid occurs in two stereoisomeric forms as well as in a true racemic mixture of the isomers. In biologic systems, a true racemic mixture, with equal molarity of the isomers, rarely occurs; usually one or the other isomer predominates. The stereoisomers are named l(+) and d(−) lactic acid. l(+)-Lactate is a normal intermediary of mammalian metabolism. The d(−)-lactate isomer is produced when tissue glyoxalase converts methylglyoxal into lactic acid of the d form, as occurs in lactose-fermenting bacteria. l(+)-Lactate has low toxicity because of its rapid metabolism; d(−)-lactate has higher toxic potential. Psychoneurotic disturbances have been described with pure d(−)-lactate, and increasing evidence indicates a connection between high plasma concentrations of racemic lactate and anxiety and panic disorders. Racemic dialysis fluids have reportedly been associated with clinical cases of d-lactate toxicity. Experiments with the isomers have shown that d(−)-lactate causes significant inflammatory changes in rats and swine as well as activation of human neutrophils.

Table 4.11
Components of ketone Ringer solution.
Component               Normal Saline (mEq/L)   d-LR (mEq/L)   l-LR (mEq/L)   Ketone Ringer (mEq/L)
d-Lactate                        0                   14              0                 0
l-Lactate                        0                   14             28                 0
3-d-β-Hydroxybutyrate            0                    0              0                28
Sodium                         154                  130            130               130
Potassium                        0                    4              4                 4
Calcium                          0                    3              3                 3
Chloride                       154                  109            109               109
Replacing lactate with an alternative fuel source such as ketone affected the immunologic response after resuscitation. LR, Lactated Ringer solution.

In 1999, with the new information implicating LR solution as the cause of ARDS and MODS, the U.S. Navy contracted with the Institute of Medicine to review the topic of the optimal resuscitation fluid. The report made many recommendations; key among them were that LR solution be manufactured with only the l(+) isomer of lactate and that researchers continue to search for alternative resuscitation fluids that contain other nutrients, such as ketones, rather than lactate. It stated that the optimal resuscitation fluid is 7.5% HTS because of the decreased inflammation associated with it as well as its logistic advantage in terms of weight and size. Although the Institute of Medicine had been asked to make recommendations for the military, the report's authors thought that the evidence was applicable to civilian injuries as well. The U.S. military also asked Baxter, among other manufacturers of LR solution, to eliminate d(−)-lactate from LR solution, which it has done; the LR solution from Baxter currently contains only the l(+)-lactate isomer.

HTS has a long record of research and development. It has been used in humans for decades and has been consistently shown to be less inflammatory than LR solution. This represented a paradigm shift in recognizing that LR solution and normal saline may be detrimental. Again, blood is complex, and the fluids used in the past were a poor replacement.

It was also recognized that PRBCs are different from whole blood and are a poor replacement for the whole blood lost during hemorrhage. PRBCs are separated by centrifuge, washed, and then filtered, and much of the plasma and its contents are decanted off. Clotting factors, glucose, hormones, and cytokines crucial for signaling are not present in PRBCs or in most of the fluids formerly used for resuscitation. Evidence that the fluid type affects the inflammatory response is now growing and has been confirmed in a number of studies.

The Committee on Tactical Combat Casualty Care was formed in 2000 by the U.S. Navy and now sets policy on the prehospital management of combat casualties. Its recommendations and algorithm for resuscitation were revolutionary compared with the civilian recommendations ( Fig. 4.17 ). The algorithm was formed with the following points in mind:

  1. Most combat casualties do not require fluid resuscitation.
  2. Oral hydration is an underused option, because most combat casualties do not require IV fluid resuscitation.
  3. Aggressive resuscitation has not been shown to be beneficial in civilian victims of penetrating trauma.
  4. Moderate resuscitation in animal models of uncontrolled hemorrhage offers the best outcome.
  5. Large volumes of LR solution are not safe.
  6. Colloid or HTS offers a significant advantage in terms of less weight and cube for the military medic or corpsman.

Fig. 4.17, New recommendation for fluid resuscitation from the U.S. military by the Committee on Tactical Combat Casualty Care.

The resuscitation fluids of choice for casualties in hemorrhagic shock, listed from most to least preferred, are whole blood; plasma, RBCs, and platelets in a 1:1:1 ratio; plasma and RBCs in a 1:1 ratio; plasma or RBCs alone; and Hextend and crystalloid (LR solution or Plasma-Lyte A).

As crystalloids were being recognized as potentially harmful to bleeding patients, a consensus panel of military experts recommended that a plasma volume expander, 6% hetastarch (Hextend), be the nonblood fluid of choice for the military. The rationale was that even though the Institute of Medicine recommended 7.5% HTS, it was not commercially available and was not approved by the FDA for use in bleeding patients. The panel believed that a colloid offered the benefit of less weight and cube, meaning that the average medic could resuscitate patients with about one-third of the volume required with crystalloid and would not have to carry large bags of LR solution or normal saline in the field. It was recognized that most casualties were not in hemorrhagic shock and were not in any jeopardy of bleeding to death; only a minority of patients required fluid resuscitation in the field. Surgeons and anesthesiologists generally prefer all patients to be nil per os (NPO; nothing by mouth) to avoid aspiration during induction of anesthesia and surgery, but trauma patients are never NPO, and with rapid sequence induction of anesthesia the risk of aspiration is minimal. The committee recommended placing an IV line, but not administering IV fluid, in casualties with normal mentation and a normal radial pulse character; instead, oral hydration was advised. In those in hemorrhagic shock, manifested by altered mental status and decreased pulse amplitude, the committee recommended administering 500 mL of Hextend. The use of Hextend was limited to 1 L, given its potential for exacerbating coagulopathy.

Damage Control Resuscitation

Once crystalloid solutions were recognized as possibly being the primary cause of the inflammatory process after traumatic hemorrhagic shock, efforts were made to reduce their use on the battlefield. Abdominal compartment syndrome ( Fig. 4.18 ), which had been described after aggressive resuscitation, was also found to be directly associated with the volume of crystalloid infused. Thus, the concept of damage control resuscitation, or hemostatic resuscitation, was developed. It involved concentrating on rapid control of bleeding as the highest priority; using permissive hypotension, which minimizes the use of acellular fluids as well as potential disruption of natural clot formation; minimizing the use of crystalloid solutions; using HTS to reduce the total volume of crystalloid necessary; using blood products early; and considering the use of drugs, such as PCC and TXA, to stop bleeding and to reduce coagulopathy ( Box 4.2 ). The rationale for the early use of blood products was that large volumes of crystalloids were detrimental; if whole blood is available, it should be used first, and otherwise component therapy with PRBCs, thawed plasma, and platelets in a 1:1:1 ratio approximates whole blood and minimizes the use of acellular fluids. Component therapy is not ideal compared with fresh whole blood, but because fresh whole blood was not always readily available for logistic reasons, component therapy was used empirically for massively bleeding patients with ongoing uncontrolled hemorrhage. Mental status was thought to be a useful guide to determine who needed care, and the radial pulse was preferred to BP cuffs, which are not practical when personnel are under fire in the combat setting.

Fig. 4.18, Patient after damage control surgery with abdominal and thoracic compartment syndrome caused by massive fluid resuscitation.

Box 4.2
Components of damage control or hemostatic resuscitation.
From Dellinger RP, Levy MM, Carlet JM, et al. Surviving Sepsis Campaign: International guidelines for management of severe sepsis and septic shock: 2008. Crit Care Med. 2008;36:296–327.
FFP, Fresh frozen plasma; PRBCs, packed red blood cells; rFVIIa, recombinant activated factor VII; TXA, tranexamic acid.

  • Permissive hypotension until definitive surgical control

  • Minimize crystalloid use

  • Initial use of 5% hypertonic saline

  • Early use of blood products (PRBCs, FFP, platelets, cryoprecipitates)

  • Consider drugs to treat coagulopathy (rFVIIa, prothrombin complex concentrate, TXA)

With the promotion of damage control resuscitation, clinical studies indicated that aggressive early use of blood products, such as PRBCs and FFP, actually reduced the total volume of PRBCs used by 25%. These studies also used permissive hypotension and focused on surgical control of hemorrhage rather than on resuscitation first. Other studies have shown that, with damage control resuscitation, the incidence of ARDS decreased from 25% of ICU admissions to 9%. ARDS now occurs in patients with pulmonary contusion, long bone fractures, pneumonia, or sepsis, but it is no longer a routine complication in trauma patients who undergo damage control resuscitation.
