Introduction

Thanks to a sophisticated and efficient thermoregulatory system, humans maintain their core body temperature within ±0.2°C of its normal value of 37.0°C despite changing ambient temperatures. This characteristic defines a homeothermic organism, in contrast to a poikilothermic organism (e.g., reptiles), in which body temperature approximates ambient temperature.

The German physician Karl R. A. Wunderlich (1815–1877) should be regarded as one of the pioneers in temperature monitoring and thermoregulation research. He was the first to systematically measure body temperature under various conditions, introduce thermometers into clinical practice, and define the normal axillary temperature of 37.0°C with a range of 36.2° to 37.5°C. A systematic literature review defined the normal temperature range for adults (men and women) as 35.5° to 37.0°C axillary, 33.2° to 38.2°C oral, 34.4° to 37.8°C rectal, and 35.4° to 37.8°C tympanic. At present, the interval of 36.8° to 37.2°C still defines the interthreshold range and represents the maximal deviation in core body temperature that is tolerated by the thermoregulatory system without triggering an effector response.

In humans, the core (or central) body temperature refers to the temperature of the vessel-rich group of organs (i.e., brain, heart, lungs, liver, and kidneys). The skin is the compartment acting as a barrier to the environment. The musculoskeletal system, which makes up the main component of the peripheral compartment, can be seen as a dynamic buffer between the core and the skin.

Few physiologic parameters are as vigorously and effectively controlled as core temperature, a feat accomplished by a delicate system balancing heat generation and heat loss. Nevertheless, the thermoregulatory system’s ability to dissipate or generate heat by means of skin blood flow regulation, sweat production, and changes in minute ventilation and metabolism can easily be overwhelmed by extreme ambient conditions.

Thermoregulation is also subject to circadian rhythms, some of which are already present in the first days of life (although they are usually less pronounced at the extremes of age) and are generally closely associated with the sleep/wake cycle. Body temperature is usually lower in the morning and higher in the late afternoon and evening. In fertile women, the luteal phase of the menstrual cycle induces a slightly higher body temperature ( ).

Anesthesia and surgery are powerful modulators of thermoregulation, and minor changes in body temperature may result in significant changes in tissue and cellular function. Monitoring and maintaining core temperature within these narrow physiologic limits during anesthesia is therefore essential.

This chapter discusses the relative merits of different anatomic sites for temperature monitoring, the principles and physiology of thermoregulation in adults and children, the influence of anesthetic agents on thermoregulation, the physiologic consequences of hypothermia and hyperthermia, and techniques to limit perioperative temperature disturbances.

Temperature monitoring

Whereas most countries measure temperature in degrees Celsius (°C), a few use degrees Fahrenheit (°F). A body temperature of 37.0°C is equivalent to 98.6°F. The following formulas allow for conversion from one unit to the other:

°Celsius = (°Fahrenheit − 32) × 5/9
°Fahrenheit = (°Celsius × 9/5) + 32
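
For readers who automate unit handling (e.g., in monitoring software or spreadsheets), these formulas translate directly into code. The following is a minimal sketch in Python; the function names are illustrative only.

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0


def celsius_to_fahrenheit(temp_c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return temp_c * 9.0 / 5.0 + 32.0


# Example: normal core body temperature
print(celsius_to_fahrenheit(37.0))   # 98.6
print(fahrenheit_to_celsius(98.6))   # 37.0
```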

Perioperative detection of changes in body temperature requires accurate monitoring and appropriate monitoring sites. National anesthesia guidelines in many countries require that a method for measuring body temperature during anesthesia be readily available ( ; ; ).

Although mercury thermometers were the standard for decades, the most common thermometers currently used in the perioperative setting are thermocouples and thermistors. A thermocouple consists of two wires made of different metals, often copper and constantan (a copper-nickel-manganese-iron alloy). At the junction of any two different metals from the so-called thermoelectric series, a small voltage is generated that depends on the temperature difference between the measuring junction and a reference junction; temperature is derived from this voltage change.

American inventor Samuel Ruben was the first to point out, in 1930, that electrical resistance changes exponentially with temperature. This forms the basis of the thermistor-type thermometer, which consists of a tiny semiconductor resistor typically made of metal oxides (e.g., of copper, nickel, manganese, or cobalt). The change in electrical resistance is analyzed to measure temperature. Both thermocouple and thermistor probes are inexpensive and considered sufficiently accurate for clinical purposes.

Infrared thermometers are quite popular in postanesthesia care units and on hospital wards, but they are not suitable for continuous temperature monitoring during anesthesia. These thermometers are attractive for clinical use because of their fast response time, but their accuracy in clinical practice has not been confirmed, particularly because they depend on proper measuring technique ( ; ; ; ; ; ; ).

Clinically accessible sites for core temperature measurement are tympanic membrane, nasopharynx, distal esophagus, pulmonary artery, and, with some limitations, bladder and rectum. All these sites usually provide similar readings in awake humans and in anesthetized humans undergoing noncardiac surgery ( ). However, different monitoring sites may measure different temperatures under certain clinical conditions, and the physiologic and clinical significance of these differences may vary. The precision and accuracy of measurements at different body sites have been studied, and each site has its advantages and disadvantages ( ; ).

A pulmonary artery catheter with a distal-tip thermistor represents the gold standard for core temperature measurement. However, due to its invasive nature, its use—particularly in the pediatric population—has been limited to very special situations.

Skin temperature measurement is often used intraoperatively given its availability and ease of use. However, it lacks accuracy as a reflection of core temperature ( ; ).

The tympanic membrane has been suggested to be close to an ideal temperature-monitoring site. Although the temperature probe need not be in direct contact with the tympanic membrane to accurately reflect tympanic temperature, the external auditory canal must be sealed by the probe so that the air column trapped between the probe and the tympanic membrane can reach a steady-state temperature. In the initial postoperative period in pediatric patients after open heart surgery, tympanic temperature does not correlate well with brain temperature and therefore does not provide a reliable estimate of central body temperature ( ; ). Because of the difficulty of obtaining appropriately sized thermistors and reports of tympanic membrane perforation, its clinical use for continuous intraoperative temperature measurement has been discouraged.

For a nasopharyngeal temperature probe to closely reflect core temperature, its tip needs to be placed in the posterior nasopharynx close to the soft palate. This should provide a good approximation of core temperature under anesthesia. However, the combination of an uncuffed endotracheal tube with a moderate to large air leak may lead to falsely low readings due to the leak airflow. In contrast, oral temperature is generally considered inadequate and not recommended for accurate intraoperative temperature monitoring ( ).

The esophageal temperature probe is readily available and easy to use. Historically, it has often been combined with an esophageal stethoscope, which makes this site particularly attractive for the pediatric population. In infants, children, and cachectic patients, the thermal insulation between the tracheobronchial tree and the esophagus is usually minimal; the respiratory gas flow may therefore cause erroneous temperature readings, particularly when the fresh gas flow is high and its temperature differs significantly from body temperature ( ). To measure core temperature, the tip of the probe needs to be placed in the distal third of the esophagus, at the retrocardiac level ( ; ).

The axilla is not only the most commonly used but also the most convenient site for temperature monitoring. However, accurate readings are obtained only when the tip of the thermometer is carefully placed over the axillary artery and the arm is closely adducted ( ). Unfortunately, the probe is frequently malpositioned, resulting in unreliable estimates of core temperature, and infusion of cool intravenous solutions at high flow rates on the ipsilateral side in small children may result in falsely low temperature readings.

Rectal temperature monitoring is associated with minimal morbidity, and its ease of insertion confers major advantages ( ). Potential sources of erroneous readings include insulation of the probe by feces (and therefore a slow response to temperature changes), exposure of the probe to cooler blood returning from the legs, an open abdominal cavity during laparotomy, and irrigation of the bladder or abdomen with either cold or warm solutions. Relative contraindications to rectal temperature probe insertion include inflammatory bowel disease, anorectal malformations, neutropenia and/or thrombocytopenia, and planned irrigation of the bowel or bladder (because of inaccuracy). The depth of probe insertion also seems to affect accuracy: a study of young adult males during exercise noted that an insertion depth of 4 cm provided the shortest latency, the best responsiveness, and the most accurate tracking of temperature change compared with insertion depths of more than 10 cm.

Bladder temperature monitoring is considered one of the most accurate core temperature measuring sites. Its precision has been shown to match that of pulmonary artery temperature monitoring as long as urinary output is adequate. However, when urinary output is diminished, this site may reflect central temperature inaccurately ( ; ).

The choice of the temperature monitoring site is often guided by the surgical procedure. For cardiac surgery patients, in whom temperatures from different body sites convey useful information, temperature is usually measured in at least two sites (e.g., rectum or bladder, and esophagus or nasopharynx). For pediatric patients undergoing a short surgical procedure not requiring endotracheal intubation, either rectal or axillary temperature monitoring can be used safely. If the child is intubated, the use of a distal esophageal temperature probe should be considered.

Physiology of thermal regulation

Survival after body temperatures as low as 13.7°C has been reported, whereas death from protein denaturation occurs only about 7°C above normal, at approximately 44°C ( ). Humans thus tolerate cold far better than heat, which is why the system for heat dissipation must be much more effective than the system for heat generation.

The thermoregulatory system is similar to other physiologic control systems in the sense that the brain uses negative feedback mechanisms to keep core temperature fluctuations minimal. In this traditional engineering model to describe thermoregulation, a set point system for the hypothalamic integration of thermal information provides an explanation of how the thermoregulatory system functions and how temperature is regulated. Temperature control is considered to consist of a complex afferent system conveying thermal information from different areas of the body to a unified central controller, which compares the afferent thermal information with its set point temperature. The central controller then triggers the necessary responses via the efferent system to keep body temperature deviations from that set point minimal.

In this traditional approach to thermoregulation, the principal site of temperature regulation is the preoptic anterior part of the hypothalamus (POAH). The POAH integrates afferent signals from temperature-sensitive cells found in most tissues throughout the body, including other parts of the brain, spinal cord, central core tissues, respiratory tract, gastrointestinal tract, and the skin surface. The processing and regulation of the thermoregulatory information occurs in three stages: afferent thermal sensing, central controlling, and efferent response.

Afferent thermal sensing

Anatomically distinct warm and cold receptors in the periphery of the body (mainly the skin and the oral and genitourinary mucosa) sense the ambient temperature. The skin contains approximately 10 times more cold receptors (located in or just beneath the epidermis, with a maximal discharge rate at temperatures between 25° and 30°C) than warm receptors (located slightly deeper in the skin, with a maximal discharge rate between 45° and 50°C), underlining the skin’s importance in sensing environmental temperature in general and in detecting cold in particular ( ). Thermosensitive, predominantly warm receptors, which guard against overheating, are also located near the great vessels, in the viscera, in the abdominal wall, in the brain itself (pons, medulla oblongata), and in the spinal cord. All the information from these peripheral receptors is conveyed to the POAH, where it is integrated and appropriate efferent impulses are sent out to keep body temperature constant ( ).

Although cold and warm signals originate from anatomically different nerve fibers, the speed of transmission of thermal impulses from peripheral thermoreceptors is influenced mainly by the intensity of the stimulus rather than by the fiber type. It is well established that the rate of skin temperature change alters its apparent importance: rapid changes contribute up to five times as much to central regulation as slower changes of comparable intensity ( ). Thermal information from cold-sensitive receptors is transmitted to the preoptic area of the hypothalamus by A-delta fibers, whereas information from peripheral warm receptors travels via unmyelinated C-fibers. These fibers also convey pain sensations, which explains why intense heat cannot be reliably distinguished from severe pain ( ; ). Most peripheral thermoreceptors exhibit phasic activity, with increased firing during changing temperatures and quick adaptation to a stable temperature, allowing the organism to react quickly to environmental changes ( ).

Central regulation

Processing of afferent thermal information takes place in the POAH, whereas the posterior hypothalamus controls the efferent pathways to the effectors. Thermal inputs from the skin surface, spinal cord, and deep body tissues are integrated in the POAH and compared with the threshold temperatures, triggering either heat gain or loss. Once the threshold has been reached, the hypothalamus then carefully orchestrates the mechanisms for heat generation or dissipation in order to maintain body temperature within the narrow limits of its set point (or, more precisely, its interthreshold range).

The POAH itself contains heat- and cold-sensitive neurons, with the former predominating by 4:1 ( ). However, the vast majority of the neurons in this area are temperature-insensitive ( ). Because this area also receives and processes nonthermic afferent information, the POAH seems to play an integral part in controlling the adaptive mechanisms and the behavior of the organism ( ).

Direct heat stimulation of this area results in increased discharge rates from the heat-sensitive neurons with activation of heat loss mechanisms. Conversely, hypothalamic cold-sensitive neurons respond to direct cooling with increased discharge rates from the POAH, resulting in activation of heat gain mechanisms ( ; ). Other central nervous areas involved in thermoregulation include the dorsomedial hypothalamus, periaqueductal gray matter, and nucleus raphe pallidus in the medulla oblongata and the spinal cord, although their function remains to be fully elucidated ( ; ; ; ).

The contribution of the central thermoreceptors to thermal regulation under normal conditions, however, is limited by the marked predominance of thermal input from peripheral receptors ( ). These central receptors take over thermoregulation once the sensory input from peripheral sensors is disrupted (e.g., central neuraxial anesthesia or spinal cord transection), but they are less efficient than peripheral thermoreceptors ( ).

The threshold temperature defines the central temperature at which a particular thermoregulatory effector is activated. When the integrated input from all sources is signaling that the interthreshold range is exceeded on either side, efferent responses are initiated from the hypothalamus in an attempt to maintain normal body temperature ( Box 7.1 and Fig. 7.1 ).

BOX 7.1
Definition of Temperature Regulation Terms

  • Threshold temperature: Central temperature that elicits a regulating effect to maintain normothermia (e.g., vasoconstriction, vasodilatation, shivering, nonshivering thermogenesis, and sweating).

  • Interthreshold range: Temperature range over which no thermoregulatory responses are triggered.

  • Gain: Intensity of the thermoregulatory response.

  • Mean body temperature: Computed and physiologically weighted body temperature averaged from various tissues.

  • Shivering: Heat production through involuntary skeletal muscle activity.

  • Nonshivering heat production: Metabolic thermogenesis above basal metabolism not associated with muscle activity.

  • Dietary thermogenesis: Heat production through metabolism of nutrients.

Fig. 7.1, The Thermoregulatory Pathways.

The slope of the response intensity plotted against the difference between the thermal input temperature and the threshold temperature is called the gain of that response (i.e., the intensity of the response). The difference between the warm-response threshold (the core temperature above which warm-defense responses such as sweating are triggered) and the cold-response threshold (the core temperature below which cold-defense responses such as vasoconstriction are triggered) indicates the thermal sensitivity of the system. As previously stated, the interthreshold range defines the temperature range over which no regulatory responses occur. This range widens from approximately 0.4°C in the awake state to approximately 3.5°C during general anesthesia (depending on the drugs used for anesthesia).
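
To make the threshold-and-gain terminology concrete, the relationship can be sketched as a simple piecewise-linear model: no response within the interthreshold range, then a response whose intensity grows with the distance beyond the threshold, scaled by the gain. The thresholds, gains, and response units below are hypothetical values chosen purely for illustration, not physiologic constants.

```python
def effector_intensity(core_temp_c: float, threshold_c: float,
                       gain: float, cold_response: bool) -> float:
    """Illustrative response intensity: zero inside the interthreshold range,
    then increasing linearly with the deviation beyond the threshold."""
    if cold_response:
        deviation = threshold_c - core_temp_c   # e.g., vasoconstriction, shivering
    else:
        deviation = core_temp_c - threshold_c   # e.g., vasodilatation, sweating
    return gain * max(0.0, deviation)


# Hypothetical awake thresholds spanning a ~0.4 degC interthreshold range
shivering = effector_intensity(36.2, threshold_c=36.8, gain=2.0, cold_response=True)
sweating = effector_intensity(37.5, threshold_c=37.2, gain=1.5, cold_response=False)
print(shivering, sweating)   # about 1.2 and 0.45 (arbitrary units)
```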

The mechanism by which the body determines the absolute threshold temperatures is not known, but the thresholds appear to be influenced by multiple factors, such as plasma concentrations of sodium, calcium, thyroid hormones, tryptophan, general anesthetics, and other drugs, as well as circadian rhythm, exercise, pyrogens, food intake, and adaptation to cold and warmth. Central regulation is already fully functional in the term neonate but may be impaired in the premature, elderly, or critically ill patient.

Efferent response

It is known that regulatory responses are mainly based on mean body temperature, a physiologically weighted average temperature that reflects the thermoregulatory importance of various tissues, but in particular that of the central compartment. The skin is basically the temperature sensor of the environment, and skin temperature is the most important parameter in triggering behavioral changes. However, in terms of impact on the thermoregulatory autonomic response, the thermal input from the skin contributes only about 20% ( ; ). The main part of the autonomic response depends on afferent information from the central core, which includes the brain (the hypothalamus and other areas), the spinal cord, and deep abdominal and thoracic tissues, with each of them contributing about 20% to the central thermoregulatory control ( ; ; ).
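
As a purely illustrative sketch of how such a "physiologically weighted average" could be computed, the snippet below applies the approximate 20% contributions cited above to hypothetical compartment temperatures; the exact weights and compartments vary between studies and are assumptions here.

```python
def mean_body_temperature(temps_c: dict, weights: dict) -> float:
    """Weighted average of compartment temperatures; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-6
    return sum(weights[site] * temps_c[site] for site in weights)


# Hypothetical compartment temperatures with the ~20% weights mentioned in the text
temps = {"skin": 33.0, "brain": 37.0, "spinal_cord": 37.0,
         "deep_thorax": 37.0, "deep_abdomen": 37.0}
weights = {site: 0.2 for site in temps}
print(round(mean_body_temperature(temps, weights), 2))   # 36.2
```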

The thermal steady state is actively defended by the POAH’s responses to thermal changes—that is, temperatures exceeding the interthreshold range on either side. Thus thermal deviations from the threshold temperature initiate efferent responses that in cold defense either increase metabolic heat production (shivering or nonshivering thermogenesis) and/or decrease environmental heat loss (active vasoconstriction and behavioral changes) and in heat stress increase heat loss (active vasodilatation, sweating, and behavioral maneuvers).

Efferent responses (behavioral changes, cutaneous vasoconstriction or vasodilatation, nonshivering thermogenesis, shivering, and sweating) all appear to be mediated according to the central interpretation of the afferent input. Cutaneous vasoconstriction is the first and most consistent thermoregulatory response to hypothermia. Total digital skin blood flow can be divided into a nutritional (capillaries) and a thermoregulatory (arteriovenous shunts) component. Cold-mediated vasoconstriction is most pronounced (down to 1% of the normal blood flow seen in a thermoneutral environment) in arteriovenous shunts of hands, feet, ears, lips, and nose ( ; ).

Flow changes not only in the arteriovenous shunts but also in the far more numerous capillaries ( ). The impressive decrease in cutaneous perfusion secondary to thermoregulatory vasoconstriction reduces heat loss by 50% from the hands and feet but by only 17% from the trunk, for an overall heat loss reduction of approximately 25% ( ).

Voluntary muscle activity, nonshivering thermogenesis, and shivering are the efferent mechanisms that lead to heat generation. In contrast, warm exposure initially results in sweating, which triggers profound precapillary vasodilatation with a marked increase in skin blood flow. This allows large amounts of heat to be transported to the skin, from where it dissipates to the environment, mainly by evaporation of sweat.

The thermoneutral zone is defined as the ambient temperature range at which the oxygen demand (as a reflection of metabolic heat production) is minimal and temperature regulation is achieved through nonevaporative physical processes only (i.e., vasoconstriction or vasodilatation) ( ). Despite the effectiveness of the thermoregulatory system, behavioral responses (heating the home, seeking shelter, putting on a jacket, etc.) to environmental temperatures outside the thermoneutral zone remain an important thermoregulatory effector in humans ( ).

Heat-loss mechanisms

The ability to generate or dissipate heat is fundamental for homeothermic organisms. Controlled heat loss in homeotherms is accomplished in two stages, both governed by the physical processes of conduction, radiation, convection, and evaporation ( ). The second law of thermodynamics states that heat flows spontaneously only from a warmer to a cooler object, never from a cooler to a warmer one. In the operating room setting, the warmer object is almost exclusively the patient, who therefore warms up the surrounding cooler objects (operating room walls, tables, instruments, etc.). One has to consider that without any heat loss to the environment (i.e., with perfect insulation), the body of an awake adult at rest would warm up by at least 1°C per hour (during exercise, metabolic heat generation can increase up to 10-fold) ( ).

The first stage of heat loss during anesthesia results from heat transfer from the body core (central compartment) to the periphery and the skin surface, a process referred to as internal redistribution of heat. It is caused by peripheral vasodilatation, which allows warm blood from the core to be redistributed to the distal extremities while cooler blood returns from the periphery to the core. This redistribution accounts for approximately 80% of the decrease in core body temperature in adults during the first hour of anesthesia. In the following 2 hours, redistribution accounts for only about 40% of the total temperature reduction; over an entire 3-hour period of anesthesia, redistribution is responsible for about 65% of the temperature reduction ( ).

During the second stage, heat is dissipated from the skin surface to the environment. Physiologic manipulation of regional blood flow and changes in the thermal conductance of the insulating tissue can influence both gradients (core to skin and skin to environment). Several studies of thermal regulation in infants and children have quantified the relative contributions of radiation, convection, evaporation, and conduction to heat loss. A study in newborns in a thermoneutral environment found radiation, convection, evaporation, and conduction to account for 39%, 34%, 24%, and 3% of total heat loss, respectively ( ). The heat loss mechanisms involved in the operating room setting are summarized in Fig. 7.2 .

Fig. 7.2, The Four Mechanisms Contributing to Perioperative Hypothermia.

Radiation

Radiant heat loss refers to transfer of heat between two objects of different temperatures that are not in contact with each other (e.g., radiation is the mechanism by which the sun warms the earth). The emitted radiation carries the energy from the warmer to the cooler object, thus causing the warmer object to cool and the cooler object to warm.

Newborns and infants have a large surface area–to–mass ratio, so radiant heat loss is proportionally greater the smaller the infant. In both the awake and the anesthetized infant, radiation is the major factor for heat loss under normal conditions. The human body is an excellent emitter of energy at wavelengths relevant to heat transfer, and the probability of photon reflection in the standard operating room is almost zero. Radiant heat loss in the operating room is therefore a function of the temperature difference between the patient’s body and the room (i.e., the floor, walls, and ceiling) and all the objects in it. Warming up the operating room (and its contents) reduces the temperature gradient between patient and environment and therefore radiant heat loss. However, as long as a temperature gradient exists, the patient continues to warm up the surrounding environment (assuming the patient is warmer than the rest of the operating room). At a room temperature of 22°C, about 70% of the total heat loss is due to radiation ( ).
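
The dominance of radiation at typical operating room temperatures can be illustrated with a rough Stefan-Boltzmann estimate. The emissivity, exposed surface area, and mean skin temperature below are assumptions made only for this example.

```python
# Rough estimate of radiant heat loss from an adult patient to 22 degC surroundings
SIGMA = 5.67e-8               # Stefan-Boltzmann constant, W/(m^2 * K^4)
emissivity = 0.95             # assumed emissivity of human skin
area_m2 = 1.5                 # assumed effective radiating (exposed) surface area
t_skin_k = 33.0 + 273.15      # assumed mean skin temperature
t_room_k = 22.0 + 273.15      # temperature of walls, ceiling, and room contents

radiant_loss_w = emissivity * SIGMA * area_m2 * (t_skin_k**4 - t_room_k**4)
print(round(radiant_loss_w))  # roughly 100 W
```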

Convection

Convective heat loss describes the transfer of heat to moving molecules such as air or liquids. The thin layer of air directly adjacent to the skin is warmed by conduction from the body, and heat is then carried away from this layer by convection. Changes in body posture and minute ventilation may also affect convective heat loss. For a naked individual exposed to air, the rate and direction of convective heat exchange depend on airflow velocity (“wind speed”) and the temperature difference between the air and the skin surface. Convective heat loss in the operating room is mainly due to air-conditioning systems, which are required to exchange the entire room air 15 to 25 times per hour, resulting in a constant draft that for a medium-sized operating room can easily exceed 1 million liters of air per hour.
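
The figure of more than 1 million liters per hour follows directly from the quoted air-exchange rates; the room dimensions below are an assumption chosen only to make the arithmetic explicit.

```python
# Air turned over per hour by operating room ventilation (illustrative room size)
room_volume_l = 6.0 * 6.0 * 3.0 * 1000.0    # assumed 6 m x 6 m x 3 m room, in liters
for exchanges_per_hour in (15, 25):
    airflow_l_per_h = room_volume_l * exchanges_per_hour
    print(exchanges_per_hour, f"{airflow_l_per_h:,.0f} L/h")
# 15 exchanges/h -> 1,620,000 L/h;  25 exchanges/h -> 2,700,000 L/h
```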

Evaporation

Evaporative heat loss occurs primarily through the skin and the respiratory system as heat is consumed by the conversion of water to vapor. Under conditions of thermal neutrality, evaporation accounts for 10% to 25% of heat loss. Physical factors governing evaporative heat loss include the relative humidity of the ambient air, airflow velocity, and lung minute ventilation. The driving force behind evaporation is the vapor pressure difference between the body surface and the environment. Evaporative losses comprise mainly three components: sweat (sensible water loss); insensible water loss from the skin, respiratory tract, and open surgical wounds; and evaporation of liquids applied to the skin, such as antibacterial solutions (“skin prep”). The evaporation of water from a surface is an energy-dependent process; the energy absorbed from the surface during the transition from the liquid to the gaseous state is called the latent heat of vaporization and, in the case of sweat, has a value of 2.5 × 10⁶ J/kg (provided all sweat is evaporated; sweat dripping from the body does not contribute to evaporative heat loss). This emphasizes the extraordinary power of the human sweating mechanism as a means of heat dissipation, especially considering that an adult in excellent physical condition can produce 2 to 3 liters of sweat per hour ( ; ). In an environment where the air temperature is equal to or higher than the skin temperature, sweating is the only mechanism available for dissipating metabolically produced heat. In this situation, anything that limits evaporation, such as high ambient humidity or impermeable clothing, may easily lead to overwhelming heat retention, with a potentially fatal rise in body temperature.
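
The cooling power implied by these numbers is easy to verify: multiplying the latent heat of vaporization quoted above by a sweat rate within the quoted 2 to 3 L/h range (and assuming complete evaporation) gives the sustained heat dissipation in watts.

```python
# Heat dissipated by fully evaporated sweat
LATENT_HEAT_J_PER_KG = 2.5e6     # latent heat of vaporization of sweat (from the text)
sweat_rate_kg_per_h = 2.5        # assumed rate within the quoted 2 to 3 L/h range

heat_per_hour_j = sweat_rate_kg_per_h * LATENT_HEAT_J_PER_KG
average_power_w = heat_per_hour_j / 3600.0
print(round(average_power_w))    # about 1700 W of cooling, if every drop evaporates
```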

Physiologic factors affecting evaporative heat losses relate to the infant’s ability to sweat and to increase minute ventilation. Although the physical characteristics of the newborn predispose to heat loss, it has been demonstrated that neonates are capable of sweating in a warm environment ( ). Full-term neonates begin to sweat when rectal temperature reaches 37.5° to 37.9°C and ambient temperature exceeds 35°C. Although the onset of sweat production in infants small for gestational age is slower than in full-term infants, the maximum rates of sweat production are comparable ( ). However, premature infants with a gestational age below 30 weeks show no sweating response because the lumens of their sweat glands are not fully developed ( ; ).

Only a small amount of heat is lost when dry inspired respiratory gases are humidified by water evaporating from the tracheobronchial epithelium. In adults, respiratory losses account for less than 10% of total heat loss during anesthesia ( ), and total insensible losses account for approximately 25% of the total heat dissipated. Respiratory heat loss increases if the patient breathes cool, dry air rather than warm, humidified air ( ).

Evaporative heat loss from a large surgical incision may equal or surpass all other sources of intraoperative heat loss combined ( ). Due to increased evaporative heat loss, hypothermia is also more likely to occur if the skin of the patient is wet or comes in contact with wet drapes. Coverage of the surgical field with a self-adhesive, transparent conformable film (e.g., Opsite, Tegaderm) that extends onto the surgical drapes allows fluids leaking from the wound (e.g., blood, effusions, ascites) to be directed away from the patient, thereby avoiding the pooling of fluids under the patient, which otherwise makes temperature control much more challenging, if not impossible.

Conduction

Conduction describes heat transfer between two surfaces in direct contact. The amount of heat transferred depends on the temperature difference between the two objects, the contact surface area, and the thermal conductivity of the materials. During surgery, relatively little heat should be lost to the environment via conduction, because the patient is supposed to be well protected from direct contact with cold surrounding objects ( ).

Attention should therefore be paid to ensuring that the patient’s skin is not in contact with any metallic surfaces, because metals have a high thermal conductivity and thereby facilitate heat transfer. The physiologic factors controlling conductive heat loss are cutaneous blood flow and the thickness of the subcutaneous tissue (insulation). Conduction is also responsible for the heat lost in warming up cold intravenous fluids and irrigation solutions, which can quickly and significantly decrease body temperature.
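
For completeness, steady-state conductive heat flow through a contact layer can be estimated with Fourier's law. The contact area, temperature gradient, conductivity, and layer thicknesses below are hypothetical and serve only to show why thick insulating padding matters.

```python
# Fourier's law for steady-state conduction through a flat layer: Q = k * A * dT / d
def conductive_loss_w(k_w_per_m_k: float, area_m2: float,
                      delta_t_c: float, thickness_m: float) -> float:
    """Heat flow (W) through a layer of given thermal conductivity and thickness."""
    return k_w_per_m_k * area_m2 * delta_t_c / thickness_m


# Hypothetical 0.5 m^2 contact area and a 10 degC skin-to-surface gradient
thick_pad = conductive_loss_w(0.04, 0.5, 10.0, 0.02)    # 2-cm insulating foam pad
thin_pad = conductive_loss_w(0.04, 0.5, 10.0, 0.002)    # same material, 10x thinner
print(thick_pad, thin_pad)                               # 10.0 W vs. 100.0 W
```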

Heat generation

The ability to produce heat by increasing the metabolic rate and oxygen consumption is the other prerequisite of thermal regulation for a homeothermic organism ( ). Besides the fact that three of the physical mechanisms leading to heat loss (i.e., conduction, radiation, and convection) can theoretically also be used to warm up a patient by reversing the temperature gradients, the body has the ability to actively produce heat.

Heat generation can be achieved through four mechanisms:

  1. Nonshivering thermogenesis
  2. Voluntary muscle activity
  3. Involuntary muscle activity (shivering)
  4. Dietary thermogenesis

Because the behavioral aspect of heat production (voluntary muscle activity) is usually not functional in the perioperative period, its role in heat production will not be discussed further here. Of the three remaining mechanisms for heat production, nonshivering thermogenesis is the major contributor in newborns and infants, whereas shivering thermogenesis is the main mechanism for heat production in older children and adults.

Although the time course and relationship between nonshivering and shivering thermogenesis in infants have been described, the exact time sequence and factors involved in the developmental aspects of switching shivering thermogenesis on and nonshivering thermogenesis off remain to be elucidated ( Fig. 7.3 ) ( ). However, the importance of nonshivering thermogenesis seems to decrease rapidly after the first year of life, while at the same time shivering thermogenesis is becoming more and more effective ( ).

Fig. 7.3, A, Brown adipose tissue in superficial and deep sites in the newborn. B, Relation of shivering thermogenesis to nonshivering thermogenesis as it appears in certain newborn animals. The time scale represents human development.

In general, dietary thermogenesis has little effect during anesthesia. It affects temperature only if the patient receives food with a high protein or fructose content before or during anesthesia.
