The preanalytical phase has long been recognized as a source of substantial variability in laboratory medicine. Laboratory errors, mostly due to some defect in the preanalytical phase, may lead to diagnostic errors. Understanding preanalytical variation and reducing errors in the pre-examination phase of the testing process are therefore important for improved safety and quality of laboratory services delivered to patients.
There are numerous preanalytical factors that may affect the concentration of the analyte, the measurement procedure, or the test result. These factors may be divided into two major groups: influencing and interference factors. Influencing factors are effects on laboratory results of biological origin that most commonly occur in vivo but can also be derived from the sample in vitro during transport and storage. Biological influence factors lead to changes in the quantity of the analyte in a method-independent way. Interference factors (interferences) are defined as mechanisms and factors that lead to falsely increased or decreased results of laboratory tests of a defined analyte. Interference factors and their mechanisms differ with respect to the intended analyte and analytical method. Interference factors do not affect the concentration of the analyte; instead, they alter the test result for a specific analyte after the sample has been collected. They are different from the measured analyte and interfere with the analytical procedure. Their effect is therefore method dependent and may be reduced or eliminated by selecting a more specific method. This chapter describes the most common preanalytical sources of variability (influences and interferences) and provides recommendations on how to deal with them in everyday practice.
The number of premature patient deaths associated with some kind of preventable medical error has been estimated at 98,000 per year. More recent data indicate that the actual mortality caused by preventable medical errors is fourfold higher. According to the European Commission (EC) and World Health Organization (WHO), 1 in 10 patients is harmed while receiving hospital care in developed countries. Errors in laboratory medicine can lead to increased health care expenditure, cause patient harm of various degrees, and lead to different diagnostic errors (i.e., missed diagnosis, misdiagnosis, and delayed diagnosis).
It has been suggested that laboratory test results affect approximately 70% of medical decisions, which clearly explains why laboratory errors contribute substantially to the overall error frequency in health care. Almost 40% of diagnostic errors are attributed to some error that has occurred within the area of radiology or laboratory medicine, and the majority of those laboratory errors are due to some defect in the preanalytical phase of the total testing process (TTP).
The preanalytical or, according to ISO 15189 terminology, pre-examination phase of the laboratory testing process has been recognized since the early 1970s as an important source of variability, and it still represents one of the greatest challenges for specialists in laboratory medicine.
In the second half of the 20th century, when quality assurance programs were introduced for the analytical processes, laboratories became aware that some factors outside the analytical phase also significantly impacted laboratory results. Results that did not correspond with the patient’s clinical condition have often been called “laboratory errors.” It also became clear that these variables could not be standardized or controlled by analytical quality assurance programs. In the late 1970s, Statland and Winkel defined the phase prior to analysis as the “preinstrumental phase,” which was later changed to the “preanalytical phase.”
Even before the preanalytical phase was recognized as an important issue in laboratory medicine, some experts from different areas of laboratory medicine defined these variables as influencing and interference factors, which were not immediately recognized as important sources of “laboratory errors.” It took some time for laboratory medicine professionals to gather knowledge about their causes and mechanisms and acknowledge their importance.
After years of discussion within several national and international expert groups in the 1960s and 1970s, the term biological influence factor was introduced and distinguished from interference factors. This led to the definitions established in the 1980s, which are still valid today.
The preanalytical phase is recognized as the most vulnerable part of the TTP, and it accounts for two-thirds of all laboratory errors. Preanalytical errors can occur at any step of the preanalytical phase—for example, during test requesting, patient preparation, sample collection, sample transport, handling, and storage. This high frequency of preanalytical errors may be attributed to various reasons. Many preanalytical steps are performed outside the laboratory and are not under the direct supervision of laboratory staff. Furthermore, many individuals are involved in various preanalytical steps, and those individuals have different levels of education and professional background. Finally, safe practice standards for many activities and procedures are either not available, or are available but not evidence-based, or the level of compliance with those standards is low.
The ISO 15189 accreditation standard clearly defines that medical laboratories are responsible for the management and quality of the pre-examination phase. It is the laboratorian’s responsibility to ensure that the right sample is taken from the right patient at the right time and that correct test results are provided to the requesting physician in a timely manner. If the quality of the specimen is compromised to a degree where the expected effect is larger than the allowable error, thus causing clinically significant bias, the sample should be rejected for analysis. Our guiding principle should be “No result is always better than a wrong result.” Patient benefit should always be the top priority.
Influencing factors are the effects on laboratory results of biological origin that most commonly occur in vivo but can also be derived from the sample in vitro during transport and storage. Biological influence factors lead to changes in the quantity of the analyte to be measured in a defined matrix. They modify the concentration of the measured (affected) analyte in a method-independent way.
These factors are either present in the healthy individual, such as circadian rhythms, or they appear as side effects of a disease and its treatment. Influencing factors may be modifiable, such as diet, time of the day, or time of the year (season), or unmodifiable, such as gender, race, ethnicity, and genetic background. Some biological influence factors, such as diet, can be controlled by patient action, whereas others, such as age, cannot. Particular care should be taken with the influencing factors whose effects may be reduced through standardization of preanalytical conditions.
As already mentioned, modification of the concentration of certain analytes can also occur in vitro. For example, glucose concentration will decrease during prolonged storage of unseparated blood due to cell metabolism, whereas potassium concentration will increase if blood is kept at lower temperatures or refrigerated (+4 °C). Such an increase in potassium will occur even without visible hemolysis.
Interferences are defined as mechanisms and factors that lead to falsely increased or decreased laboratory test results for a defined analyte. Interferences may be endogenous (i.e., biological constituents of the sample) or exogenous. Exogenous interferences occur in the preanalytical phase due to the action of some external factors or conditions that are not normally present in properly collected, transported, handled, and stored specimens. Interference factors and their mechanisms differ with respect to the intended analyte and analytical method, and they alter the result of a sample constituent after the specimen has been collected. They are different from the measured analyte and interfere with the analytical procedure. Therefore their effect is method dependent and may thus be reduced or eliminated by selecting a more specific method.
Possible interferents include the following:
Biological constituents of the sample (e.g., free hemoglobin, lipids, bilirubin, paraproteins, fibrin, fibrin clots, etc.)
Exogenous molecules present in the sample (e.g., drugs, herbal supplements, contrast media)
Exogenous molecules added to the sample during sampling or after the sampling procedure (e.g., anticoagulants, tube additives, intravenous infusions, etc.)
Because interference factors are analyte and method specific, they may be eliminated or at least reduced by changing the measurement method.
Although exogenous preanalytical interferences are not rare, they are often neglected and overlooked in everyday routine work. If they go undetected, preanalytical interferences may cause unnecessary harm to the patient and increase health care–related costs. Some examples of harmful results due to erroneous immunoassay findings are listed in Table 5.1.
| Wrong Result | Consequence |
|---|---|
| Falsely Increased Concentration | |
| High human chorionic gonadotropin indicating gonadal tumor | Unnecessary surgery, chemotherapy |
| High calcitonin indicating medullary thyroid cancer | Unnecessary fine-needle aspiration |
| High prolactin | Misdiagnosis of prolactinoma |
| High urine free cortisol | Unnecessary diagnostic follow-up |
| High testosterone in women | Unnecessary diagnostic follow-up |
| High luteinizing hormone and follicle-stimulating hormone | Unnecessary diagnostic follow-up |
| Falsely Decreased Concentration | |
| Low 25-hydroxyvitamin D result despite replacement therapy | Incorrect diagnosis of hypovitaminosis D |
| Negative human chorionic gonadotropin result | Missed diagnosis of choriocarcinoma |
| Low digoxin | Wrong treatment (overdosing with digoxin, risk of digoxin toxicity) |
| Low insulin | Missed diagnosis of insulinoma |
| Negative troponin result | Missed diagnosis of myocardial infarction |
It is very important that laboratory staff have a thorough knowledge and understanding of the assays and instruments in use in their laboratory, as well as of the potential interferents that may affect laboratory measurements. This chapter provides an overview of the most common preanalytical sources of variability (influences and interferences) and provides recommendations on how to deal with them in practice.
Influencing factors lead to changes in the quantity of the analyte in a method-independent way.
Influencing factors may be changeable (e.g., diet, time of sample collection during the day) or unchangeable (e.g., gender, race, ethnicity, genetic background).
The effect of influencing factors may be reduced through standardization of preanalytical conditions.
Interferences are mechanisms and factors that lead to falsely increased or decreased results of laboratory tests.
Interference factors and their mechanisms differ with respect to the intended analyte and analytical method and may be reduced or eliminated by selecting a more specific method.
The effect of most modifiable influencing factors can be either minimized or even entirely eliminated by standardization of preanalytical processes. Several local and international guidelines provide recommendations for efficient standardization of patient preparation and sample collection. These documents offer guidance on timing of sampling, diet and activities before sampling, body position and disinfection during sampling, and regulations regarding documentation of these variables for diagnostic and/or therapeutic purposes.
On the other hand, since the effects of unmodifiable factors cannot be eliminated by standardization, they are addressed by assigning appropriate reference intervals (e.g., gender-specific, age-specific, etc.).
Time of sampling matters for all analytes that are subject to substantial biological variation. Changes in the concentration of an analyte due to biological variation may significantly affect the result of a particular laboratory test; such changes can be either linear or cyclic in nature. Linear changes occur in a chronological order, whereas cyclic changes are repetitive, such as seasonal changes or changes due to the menstrual cycle. Knowledge of the time of sample collection is therefore necessary for correct test result interpretation and as such is an important preanalytical factor that should be carefully considered.
Several analytes tend to fluctuate in terms of their plasma concentration over the course of a day, and for this reason reference intervals are preferentially defined for sampling between 7 and 9 am. For example, the concentration of potassium is lower in the afternoon than in the morning, whereas that of cortisol decreases during the day and increases at night ( Fig. 5.1 ). Furthermore, the cortisol circadian rhythm may well be responsible for the poor results obtained from oral glucose tolerance testing in the afternoon.
For many years, it was believed that iron has a substantial circadian variation, with an early morning peak and a decrease in the afternoon; this was the main reason most blood collections for iron were done in the morning. It was only recently demonstrated that iron concentration is quite stable throughout the day, up until 3 pm, when it starts to decrease; the lowest iron concentrations were observed with collection times past 4 pm.
In some cases, seasonal influences also have to be considered. For example, total triiodothyronine (T3) is 20% lower in the summer than in the winter, whereas 25-OH-cholecalciferol exhibits higher serum concentrations in the summer than in the winter.
Some analytes can exhibit significant changes due to the hormonal variations that occur during the menstrual cycle. For example, aldosterone concentration in plasma is twice as high before ovulation as in the follicular phase, renin concentration is increased preovulatory, cholesterol exhibits a significant decrease during ovulation, and the concentrations of phosphate and iron decrease during menstruation.
Time of sampling is important not only to eliminate the confounding effects of biological variation; it is also extremely important for patients undergoing diagnostic and/or therapeutic procedures that may exert in vivo (influencing, very frequent) or in vitro (interference, much less common) effects on laboratory tests. Some examples of such diagnostic and/or therapeutic procedures are listed below:
Surgical operations
Infusions and transfusions
Punctures, injections, biopsies, palpations, whole-body massage
Endoscopy
Dialysis
Physical stress (e.g., ergometry, exercise, ECG)
Function tests (e.g., oral glucose tolerance test)
Immunoscintigraphy
Contrast media
Drugs, herbal supplements and over-the-counter medicines
Mental stress
Ionizing radiation
The plateletpheresis procedure is used in transfusion medicine to obtain platelets needed for the treatment of thrombocytopenia. The citrate used in this procedure has a chelating effect on ionized calcium and magnesium and, along with decreases in some hematology parameters, can also lead to acute ionized hypocalcemia and hypomagnesemia. Citrate chelation can also lead to decreased ionized calcium concentration in critically ill patients with a high risk of bleeding and acute renal failure who are subjected to citrate-anticoagulated continuous veno-venous hemofiltration. In this situation, hypocalcemia reflects citrate overdose. Due to the citrate effect, lower concentrations of ionized calcium and magnesium have also been observed in patients receiving high volumes of transfused blood.
Administration of hypertonic saline in patients with severe head trauma targets a therapeutic sodium concentration of 155 mmol/L (or mEq/L); the resulting hypernatremia should not be misinterpreted as contamination with saline intravenous fluid.
Iodinated contrast media used in computed tomography (CT) have high osmolality and high iodine content. Application of these iodinated contrast media for imaging purposes in the pediatric population exceeds normal daily intake of iodine, carries a risk of thyroid dysfunction, and prompts close monitoring of pediatric patients after exposure.
Contrast media may also affect some coagulation and inflammatory parameters. For example, ioxaglate and iodixanol, radiographic contrast media used in diagnostic and therapeutic angiography, may inhibit the generation of thrombin in studies performed on platelet-poor and platelet-rich plasma (PRP). An effect of contrast media used during coronary angiography on the inflammatory markers interleukin-6 (IL-6) and the soluble (s) receptors (R) for tumor necrosis factor alpha (TNFα), sTNFRα1 and sTNFRα2, has also been reported. Inflammatory markers increase after contrast media administration, and this effect is more pronounced with ionic (ioxaglate) than with nonionic (iohexol and iodixanol) contrast media.
Many drugs, herbal supplements, and over-the-counter medicines may cause various in vivo changes in the composition of the blood and subsequently influence the measurement of many laboratory parameters. For example, long-term treatment with proton pump inhibitors increases the concentration of the neuroendocrine tumor marker chromogranin A by stimulating enterochromaffin-like cells. To avoid unnecessary diagnostic procedures, it is therefore advised to discontinue proton pump inhibitors at least 2 weeks before chromogranin A testing.
Another example of a drug that exerts an in vivo influencing effect is azithromycin: in patients on azithromycin, there is a risk of drug-induced immune thrombocytopenia. Alemtuzumab infusion in patients with active relapsing-remitting multiple sclerosis has a remarkable effect on several hematology and biochemistry tests. A cross-sectional analysis of data on 9820 participants of the Rotterdam study demonstrated a significant association between the use of thiazide diuretics and an increased risk of hypomagnesemia. Trimethoprim, a drug commonly prescribed together with other antibiotics for the treatment of urinary tract infections, may cause a reversible increase in serum creatinine concentration; this increase affects the calculation of estimated glomerular filtration rate. Lack of information about this in vivo effect of trimethoprim can cause misinterpretation and erroneous clinical decisions.
The influencing effect of herbal supplements is exerted through toxicity or enzyme induction. Since herbal supplements are categorized as dietary supplements, they are not subject to strict regulations like drugs. Such permissive regulation carries a significant risk due to the uncontrolled use of herbal supplements alone or in combination with other supplements or drugs. Obviously, there is a need to increase the level of awareness about potential risks associated with the use of herbal supplements.
Kava is a traditional medicinal substance used in the Pacific region. It has a relaxing effect and is consumed to treat anxiety, as an aqueous nonalcoholic drink made from the kava rhizome. Cases of heavy kava consumption have been associated with 70-fold and 60-fold increases in alanine aminotransferase (ALT) and aspartate aminotransferase (AST) activity, respectively, as well as with markedly increased alkaline phosphatase (ALP), γ-glutamyltransferase (GGT), lactate dehydrogenase (LD), and total and conjugated bilirubin concentrations, and in extreme cases even with fulminant hepatic failure.
The most probable underlying mechanism of hepatotoxicity is related to the metabolic interaction of alcohol with kava, although multiple factors, including genetic defects in hepatic metabolism, contribute to the development of extreme reactions. Some other dietary supplements, such as LipoKinetix and Centella asiatica, which are widely used for weight loss, may also cause hepatotoxicity and even lead to fulminant hepatic failure associated with extreme increases of liver enzymes, which can resolve after discontinuation of consumption.
Kelp, a kind of seaweed used in Asia as a selenium supplement, is rich in iodine. Patients taking kelp commonly have high serum and urine iodine concentrations even when on the low-iodine diet that is mandatory for radioiodine therapy.
To prevent the confounding effect of these factors, samples should always be collected before any diagnostic or therapeutic procedure with potential influencing or interfering effects. Likewise, drugs exerting influencing or interfering effects should be administered exclusively after collecting a blood sample, if not advised differently by the requesting physician (Note: time of sampling for therapeutic drug monitoring is discussed in Chapter 42 ).
For all the above-mentioned reasons, and given the fact that in some circumstances a sample taken at the wrong time might be worse than no sample at all, the exact time of sample collection always needs to be provided to the laboratory.
For effective standardization of the time of sampling, laboratories should ensure that:
the best time of sampling for each analyte is known (taking into account how the concentration of the particular analyte changes over time),
blood is always taken at the recommended time,
the exact time of sampling is known for each sample and is recorded into the laboratory information system (LIS) by the health care staff,
patients and the health care staff are educated about when blood samples should be collected for laboratory testing, as well as about the importance of the time of blood sampling and its effect on laboratory test results.
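To illustrate how the last two points might be operationalized, here is a minimal Python sketch of a check that flags samples collected outside a recommended time window. The analyte windows, threshold times, and function names are hypothetical illustrations, not part of any cited guideline or LIS product.

```python
from datetime import time
from typing import Optional

# Hypothetical per-analyte recommended collection windows (illustrative only;
# real windows must come from local protocols and the relevant guidelines).
RECOMMENDED_WINDOWS = {
    "cortisol": (time(7, 0), time(9, 0)),    # marked circadian rhythm
    "iron": (time(7, 0), time(15, 0)),       # stable until about 3 pm (see above)
    "potassium": (time(7, 0), time(9, 0)),   # lower in the afternoon
}

def check_sampling_time(analyte: str, collected_at: Optional[time]) -> str:
    """Flag a sample whose recorded collection time is missing or falls
    outside the recommended window for the requested analyte."""
    if collected_at is None:
        return "FLAG: collection time not recorded in the LIS"
    window = RECOMMENDED_WINDOWS.get(analyte.lower())
    if window is None:
        return "OK: no time restriction defined for this analyte"
    start, end = window
    if start <= collected_at <= end:
        return "OK: collected within the recommended window"
    return f"FLAG: collected at {collected_at}, outside {start}-{end}"

print(check_sampling_time("cortisol", time(14, 30)))  # afternoon sample is flagged
```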
Prior to blood sampling, the confounding influences of food and fluid intake should be excluded. Diet and fluid intake substantially affect the composition of plasma. Differences in serum composition may occur depending on the source of nutrients, the number of meals, and the proportion of nutrients in the diet. Moreover, malnutrition or obesity, prolonged fasting, starvation, and vegetarianism may also influence plasma composition. The effects of diet can be divided into long-term and acute effects.
It is well known that changes in protein intake occurring over a couple of days may affect the composition of the nitrogenous components of plasma and the excretion of the end products of protein metabolism. Creatinine is an important example of the effect of diet on the composition of plasma. It has been shown that an increase of up to 20% in plasma creatinine concentration (measured by the kinetic Jaffe method) is observed after ingesting cooked meat. Protein-rich food affects not only the concentration of serum creatinine but also the concentrations of urea and urate in serum.
A diet rich in fat leads to increased serum triglyceride concentration, reduced serum urate, and a depletion of the body’s nitrogen pool. The nitrogen pool is affected because excretion of ammonium ions is required to maintain acid-base homeostasis. The relative ratio in which various dietary fats are consumed closely relates to serum lipid concentrations. A diet rich in monounsaturated and polyunsaturated fats causes a reduction of low-density lipoprotein (LDL) and high-density lipoprotein (HDL) cholesterol concentrations, although in some situations HDL cholesterol may be increased.
A diet rich in carbohydrates decreases serum protein and lipid concentrations (triglycerides, and total and LDL cholesterol). It should be emphasized that not only the proportion but also the source of nutrients in the diet affect the composition of serum. For example, some early studies have shown that serum ALP and LD activities are higher, whereas AST and ALT activities are lower in individuals who consume carbohydrates rich in sucrose or starch rather than other sugar types. Moreover, total, LDL, and HDL cholesterol concentrations tend to be much lower in those who consume the same amount of food in many small meals throughout the day than in individuals who eat three meals per day.
Compared to omnivorous subjects, vegetarians tend to have lower concentrations of plasma cholesterol, triglycerides, and creatinine, with reduced urinary excretion of creatinine and a higher urinary pH as a result of the reduced intake of precursors of acid metabolites. In malnourished individuals, the concentrations of most commonly measured proteins and the activities of enzymes are reduced. Most of the above-described changes normalize following the restoration of good nutrition.
While it is clear that many analytes are affected by acute ingestion of food, the direction and magnitude of the change still remain largely unclear, mainly due to substantial differences in the design of the original studies published so far. Some of the methodological aspects of these studies that might be responsible for the observed differences are: time of blood collection (morning, afternoon, etc.), time intervals at which blood collection was done (i.e., length of time after the meal), sample type (serum, plasma), assay (method principle, manufacturer), measurement equipment, baseline concentration of the analyte, measurement unit (i.e., mmol/L, mg/dL), type of meal, food composition, and other patient-related characteristics (e.g., health status, age, gender, ethnicity, physical activity, smoking status, and consumption of alcohol and coffee). Moreover, while some postprandial effects are a result of in vivo physiologic changes, others occur due to the interfering effect of sample turbidity caused by the increase of triglyceride (chylomicron) concentration and as such are method- and instrument-dependent. Table 5.2 shows the maximal postprandial effects observed anywhere within 1 to 4 hours after a meal on some of the most common chemistry analytes and hormones.
| | Guder, 2009 | Lima-Oliveira, 2012 | Kackov, 2013 | Bajana, 2019 |
|---|---|---|---|---|
| Type of meal | Standard meal | Light meal, containing standardized amounts of carbohydrates, protein, and lipids | Standardized high-calorie meal (823 kcal) | Andean breakfast, containing standardized amounts of carbohydrates, protein, and lipids |
| Blood collection time points | Baseline and 2 h after the meal | Baseline and 1 h, 2 h, 4 h after the meal | Baseline and 3 h after the meal | Baseline and 1 h, 2 h, 4 h after the meal |
| Analyte | | | | |
| Triglycerides | 78% | 28% | 71.4% | 85% |
| CRP | NA | 25% | −5% | 6% |
| Urea | 0% (no change) | −4% | NA | 26% |
| Creatinine | NA | −2.2% | NA | 33% |
| AST | 25% | 14% | NA | 5% |
| ALT | 5.5% | 18% | NA | 4.3% |
| Albumin | 1.8% | 3.4% | NA | 4.4% |
| Bilirubin (total) | 16% | −16% | NA | −29% |
| Bilirubin (direct) | NA | −24% | NA | −29% |
| Calcium | 1.6% | 3.5% | NA | 4% |
| Magnesium | NA | 3.4% | NA | 9% |
| Iron | NA | 10% | NA | −35% |
| Potassium | 5.2% | 5.8% | NA | 3% |
| Uric acid | NA | −5% | NA | −3.6% |
| TSH | NA | NA | NA | 27% |
| fT4 | NA | NA | NA | 6.6% |
| Cortisol | NA | NA | NA | −29% |
Certainly, not all changes are clinically significant, but for those analytes for which postprandial changes are clinically significant, fasting prior to blood collection is recommended to overcome this problem. The most pronounced change after a recent meal among chemistry tests is observed in triglycerides. Triglyceride concentrations in serum increase almost twofold during the absorptive phase, within 1 to 2 hours after the meal, and the magnitude of the increase obviously depends on the type of meal and time of sample collection after the meal.
Whereas the change for most analytes is mostly unidirectional, some analytes, such as phosphate, exhibit a characteristic biphasic change. Phosphate concentration initially drops by 2.7 to 8% within 1 hour after the meal, followed by an increase of up to 12.6% at 4 hours after the meal.
It is noteworthy that not only chemistry tests but also some hormones (e.g., thyroid stimulating hormone [TSH], free thyroxine [fT4], cortisol, insulin) are significantly affected by acute food ingestion.
Acute food ingestion also affects hematology and coagulation parameters. Tables 5.3 and 5.4 show the maximal postprandial effects observed anywhere within 1 to 8 hours after a meal on complete blood count (CBC) and coagulation parameters.
| | van Oostrom, 2003 | Lippi, 2010 | Kościelniak, 2017 | Arredondo, 2019 |
|---|---|---|---|---|
| Type of meal | Standardized oral-fat loading test | Light meal, containing standardized amounts of carbohydrates, protein, and lipids | Light meal, containing standardized amounts of carbohydrates, protein, and lipids (300–700 kcal) | Chilean breakfast, containing standardized amounts of carbohydrates, protein, and lipids |
| Blood collection time points | Baseline and 2 h, 4 h, 6 h, 8 h after the meal | Baseline and 1 h, 2 h, 4 h after the meal | Baseline and 1 h, 2 h after the meal | Baseline and 1 h, 2 h, 4 h after the meal |
| Analyte | | | | |
| WBC | NA | NA | 16% | 16.9% |
| Neutrophils | 42% | 7.6% | 37% | 27.4% |
| Lymphocytes | 42% | −18.7% | −12% | 15.9% |
| Monocytes | No change | −6.9% | No change | 25.0% |
| Eosinophils | NA | −23.2% | No change | No change |
| RBC | No change | −3.3% | −7% | −3.4% |
| Hgb | NA | No change | −8% | −2.7% |
| Hct | NA | −3.9% | −6% | −4.4% |
| MCV | NA | No change | No change | −2.1% |
| MCH | NA | 1.6% | No change | No change |
| Plt | NA | No change | −6% | 6.9% |
| MPV | NA | −2.3% | No change | −8.5% |
| Analyte | Lima-Oliveira, 2014 | Arredondo, 2019 |
|---|---|---|
| Activated partial thromboplastin time (aPTT) | −6.2 | −4.5 |
| Fibrinogen | No change | −3.1 |
| Antithrombin III | 3.7 | 1.8 |
| Type of meal | Light meal, containing standardized amounts of carbohydrates, protein, and lipids (563 kcal) | Chilean breakfast, containing standardized amounts of carbohydrates, protein, and lipids |
| Blood collection time points | Baseline and 1 h, 2 h after the meal | Baseline and 1 h, 2 h, 4 h after the meal |
There is an evident postprandial increase in neutrophil count, along with a decrease in red blood cell (RBC) count, as well as some RBC indices. Variations in RBC count and indices are most likely attributable to hemodilution caused by the ingestion of food and fluids. Postprandial neutrophil increase is suggested to play a role in the pathogenesis of atherosclerosis. The effect of acute ingestion of food and fluid on lymphocyte and platelet count is less clear and is most likely instrument-dependent.
Postprandial triglyceridemia causes a transient increase in the plasma levels of activated factor VII (FVIIa) and plasminogen activator inhibitor (PAI-1); the mechanism of this phenomenon is still not completely understood. As FVIIa is the first enzyme of the blood coagulation system, postprandial fluctuations can trigger the coagulation cascade and significantly change some of the plasma coagulation parameters. It should be emphasized that activated partial thromboplastin time (aPTT) is shortened in the postprandial state, which is why monitoring of unfractionated heparin could be jeopardized if samples are taken in a nonfasting state. Considering the above, a period of fasting is required before hemostasis testing.
Finally, the human body experiences a mild postprandial metabolic alkalosis in response to a meal. This alkalosis occurs due to the secretion of hydrochloric acid by the parietal cells of the stomach, which is accompanied by the extraction of chloride from the plasma and the release of bicarbonate into the plasma to maintain electrical neutrality. Thus venous blood leaving the stomach is enriched with bicarbonate, and this phenomenon is responsible for postprandial metabolic alkalosis (i.e., the alkaline tide) with a concomitant increase of pCO2 and a subsequent reduction of ionized calcium by 0.2 mg/dL (0.05 mmol/L).
To avoid any misinterpretation due to the above-described effects, blood collection should preferably be done after an overnight (12-hour) fast (discussed in more detail later in the section entitled Preparing for Blood Sampling).
Whereas drinking coffee or small amounts of alcohol is largely seen as part of normal life and therefore not worth reporting to the physician, one should be aware of the influence of the intake of various fluids on the concentration of different analytes. Ingestion of various fluids may also exert acute and chronic effects.
Many beverages, such as tea, coffee, and cola drinks, contain caffeine. Caffeine stimulates the adrenal cortex and medulla, leading to a subsequent increase in the concentrations of catecholamines and their metabolites, as well as of free cortisol, 11-hydroxycorticoids, and 5-hydroxyindoleacetic acid (5-HIAA) in serum. These hormonal changes are followed by an increase in plasma glucose concentration. Plasma renin activity may also be increased following caffeine ingestion. Caffeine induces diuresis and inhibits the reabsorption of electrolytes, thus leading to a transient, dose-dependent increase in their excretion. Total urine output of water and electrolytes (calcium, magnesium, sodium, chloride, potassium) increases within 2 hours following caffeine ingestion, and the caffeine-induced urinary loss of calcium and magnesium is largely attributable to a reduction of their renal reabsorption. Caffeine also has a marked effect on lipid metabolism. Ingestion of coffee increases the rate of lipid catabolism, thus leading to an increase of plasma free fatty acids, glycerol, and lipoproteins. Finally, caffeine is a strong stimulant of gastrin release and gastric acid secretion and also induces the secretion of pepsin.
Alcohol consumption, depending on its duration and extent, may affect a number of analytes. Among alcohol-related changes, acute and chronic effects should be considered separately. A decrease in plasma glucose and an increase in lactate are acute effects that occur within 2 to 4 hours of ethanol consumption. Ethanol is metabolized to acetaldehyde and then to acetate. This increases hepatic formation of uric acid and inhibits renal urate excretion, thus causing an increase of uric acid in plasma. Together with lactate, acetate decreases plasma bicarbonate, resulting in mild to severe metabolic acidosis, depending on the amount of ingested alcohol.
Acute alcohol ingestion increases the activity of serum GGT and some other enzymes (e.g., isocitrate dehydrogenase, ornithine carbamoyltransferase). Chronic effects of ethanol ingestion include the increase in serum triglyceride concentration due to decreased plasma triglyceride breakdown and an increase in the serum activity of many enzymes (GGT, AST, and ALT).
Moreover, chronic alcohol consumption affects pituitary and adrenal function and is associated with numerous biochemical abnormalities. It affects lipid metabolism and inhibits the sialylation of transferrin, which leads to an increased serum concentration of carbohydrate-deficient transferrin (CDT). Increased mean corpuscular volume (MCV) is related to the direct toxic effect of alcohol on erythropoietic cells or to a deficiency of folate. Ethanol also decreases the formation of vasopressin, thereby increasing diuresis. Enhanced diuresis is followed by increased secretion of renin and aldosterone.
To assess the effect of alcoholic drinks on test results and to avoid misinterpretation of laboratory results, it is recommended that the history of alcohol intake (i.e., the ingested amount and frequency/time of ingestion) be documented in clinical records.
Smoking tobacco leads to a number of acute and chronic changes in analyte concentrations, with the chronic changes being rather modest. Smoking increases the serum concentrations of fatty acids, epinephrine, free glycerol, aldosterone, and cortisol. These changes occur within 1 hour of smoking a cigarette. Through adrenal gland stimulation, nicotine increases the concentration of epinephrine in the plasma and the urinary excretion of catecholamines and their metabolites. Smoking leads to an acute increase in serum triglyceride and total and LDL cholesterol concentrations. Glucose metabolism is also dramatically affected by nicotine. Within only 10 minutes of smoking a single cigarette, glucose concentration increases by up to 10 mg/dL (0.56 mmol/L). This increase may persist for 1 hour.
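The dual units quoted here are related by simple molar arithmetic; the short sketch below (the helper function and constant name are ours, introduced for illustration) verifies that 10 mg/dL of glucose corresponds to roughly 0.56 mmol/L.

```python
# Glucose has a molar mass of about 180.16 g/mol, so
# mmol/L = (mg/dL * 10 dL/L) / (molar mass in mg/mmol).
GLUCOSE_MOLAR_MASS = 180.16  # g/mol, i.e., mg/mmol

def mg_dl_to_mmol_l(value_mg_dl: float, molar_mass_mg_per_mmol: float) -> float:
    """Convert a concentration from mg/dL to mmol/L."""
    return value_mg_dl * 10.0 / molar_mass_mg_per_mmol

print(round(mg_dl_to_mmol_l(10.0, GLUCOSE_MOLAR_MASS), 2))  # 0.56
```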
Alterations in analytes induced by chronic smoking affect numerous blood components such as CBC, some enzymes, lipoproteins, carboxyhemoglobin, hormones, vitamins, tumor markers, and heavy metals ( Fig. 5.2 ). These changes are induced by nicotine and its metabolites and reflect pathophysiologic responses to toxic effects. To avoid a risk of misinterpretation of laboratory test results, smoking habits should be documented in clinical records.
In heavy smokers, the blood leukocyte count may be increased by as much as 30%, with a proportional increase of the lymphocyte count. For carcinoembryonic antigen (CEA), different reference limits should be applied for smokers and nonsmokers due to the large differences between the two groups. The higher concentration found in smokers is caused by an increased synthesis and secretion of CEA in the colon. Tobacco smokers have higher carboxyhemoglobin concentrations. To compensate for the impaired capacity for oxygen transport in heavy smokers, there is also an increase in RBC count. Partial pressure of oxygen (pO2) is lower in tobacco smokers than in nonsmoking individuals by about 5 mm Hg (0.7 kPa). Like caffeine, nicotine is also a very potent stimulant of the secretion of gastric juice and an inhibitor of duodenal bicarbonate secretion. These effects may be observed within 1 hour of smoking several cigarettes. Smoking also affects the body’s immune response and male fertility by affecting sperm count, morphology, and motility. The effects of smoking may persist even after smoking cessation. It usually takes 5 years, or even longer, for most parameters (e.g., C-reactive protein [CRP] and fibrinogen concentrations, hematocrit) to normalize. Interestingly, for some parameters (e.g., white blood cell [WBC] count), it may take up to 20 years to return to baseline values.
Body posture influences blood constituent concentrations. This is caused by the net capillary filtration (i.e., the net result of the differences in the membrane permeability, hydrostatic pressure, colloid osmotic pressure of plasma, and interstitial fluid). Capillary filtration is especially increased in the lower extremities when changing from the supine to the upright position. The change in body posture from the supine to sitting and from sitting to the upright position leads to a significant decrease in plasma volume with a subsequent increase in the concentration of all constituents that usually do not pass the capillary filtration barrier (e.g., blood cells, large molecular weight molecules). Although this effect is observed in healthy and diseased individuals, the degree of the change is usually greater in some disease states—for example, in cardiac insufficiency.
Variations in plasma volume subsequent to a change of body position alter the blood cell counts (RBC, WBC, and platelets), hemoglobin concentration, and hematocrit; a short period of 10 minutes is usually enough for the vascular volumes to re-equilibrate and adapt to the new posture. It has also been demonstrated that patient posture might have a significant impact on the results of routine hemostasis testing, decreasing prothrombin time (PT) values and increasing fibrinogen concentration when the patient’s position is changed from supine to sitting. Finally, the net capillary filtration effect due to the change in body posture also affects low molecular weight molecules that are transported in blood bound to proteins. For example, while the concentration of free calcium is not affected, total calcium concentration increases by 5 to 10% when changing from the supine to the upright position. To minimize the effect of this preanalytical source of potential bias, reference intervals should ideally be obtained under identical conditions with regard to body posture. Blood sampling should be performed after at least 15 minutes of rest in a supine or sitting position.
A similar mechanism occurs when a tourniquet is applied to facilitate finding appropriate veins for venipuncture. The higher pressure in the veins leads to the loss of water and low molecular weight substances into the tissue, increasing the concentrations of proteins and protein-bound analytes, cells, hemoglobin, and hematocrit. This becomes clinically significant after 1 to 2 minutes of tourniquet application. Prolonged venous stasis can also cause a significant increase of fibrinogen and a shortening of aPTT and PT. Therefore the tourniquet should be released within 1 minute of application.
Physical activity of varying duration and intensity may lead to substantial changes in plasma composition, and the extent of this change depends on several factors, such as training status, intake of fluid, electrolytes, and carbohydrates, and even the ambient temperature. For example, even a mild physical effort, such as clenching the fist during venous blood sampling, can increase the concentration of potassium and should therefore be avoided. This occurs due to the release of potassium from skeletal muscles, even without a tourniquet.
Intensive exercise is associated with transient increases in cardiac biomarkers, markers of muscle damage, platelet aggregation, tissue plasminogen activator, and activation of the fibrinolytic system, with a decrease in the ability of the blood to clot and generate thrombin, as well as with leukocytosis. Cardiac troponin (cTn) rises after a maximal bicycle stress test. The majority of changes are transient, and most parameters return to baseline within 3 hours after exercise, although some hematologic indices, such as red cell distribution width (RDW), have been observed to continue to increase after a half-marathon run, reaching a peak 20 hours after the run. Furthermore, it has been demonstrated that in individuals who are physically active more than 12 hours per week, concentrations of creatine kinase (CK), creatine kinase MB (CK-MB), ALT, and LD are increased for a prolonged period of time.
Due to such substantial changes in plasma composition, in professional athletes (e.g., marathon runners), a large proportion of laboratory results may fall outside the usual reference intervals.
Intensive physical activity (within 12 hours before blood sampling) may also affect homeostasis for numerous hormones including catecholamines and their derivatives, epinephrine, norepinephrine, dopamine, corticotropin (ACTH) and vasopressin, gastrin, TSH, prolactin, growth hormone, aldosterone, cortisol, testosterone, human chorionic gonadotropin (hCG), insulin, glucagon, and β-endorphin.
Because food, fasting time, circadian rhythm, muscular activity, smoking, drugs, and ethanol consumption can affect the concentrations of numerous analytes, standardization of all these controllable variables is highly recommended. Proper standardization of controllable variables leads to a significant reduction of preanalytical variability. In the past, there has been great heterogeneity in the definition of the fasting state used for different analytes by different health care facilities and in the literature. To facilitate agreement on the definition of the fasting state and to encourage uniform and consistent compliance, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Working Group for Preanalytical Phase (WG-PRE) has published a recommendation for the definition of fasting requirements as a guiding framework for harmonization of this important preanalytical aspect.
According to these recommendations, the following general requirements should be applied to all blood tests:
Blood should preferably be drawn in the morning, between 7 and 9 am.
Fasting should last for 12 hours, during which only water consumption is permitted.
Alcohol should be avoided for 24 hours before blood sampling.
In the morning before blood sampling, patients should refrain from cigarette smoking and caffeine-containing drinks (tea, coffee, etc.).
Professional associations and laboratories worldwide are encouraged to adopt, implement, and disseminate the EFLM WG-PRE recommendation for the definition of fasting. Moreover, laboratories worldwide should have policies for sample acceptance criteria related to fasting samples. Blood samples for routine testing should not be taken if a patient has not been appropriately prepared for sample collection.
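As a sketch of what such a sample acceptance policy could look like in code, the check below restates the four general requirements listed above. The data structure, field names, and the assumption that this information is captured at registration are ours, not part of the EFLM recommendation.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class SampleIntake:
    collection_time: time
    fasting_hours: float
    hours_since_alcohol: float
    smoked_this_morning: bool
    caffeine_this_morning: bool

def fasting_violations(s: SampleIntake) -> list[str]:
    """Return the list of violated general fasting requirements;
    an empty list means the sample is acceptable on these criteria."""
    problems = []
    if not time(7, 0) <= s.collection_time <= time(9, 0):
        problems.append("blood not drawn between 7 and 9 am")
    if s.fasting_hours < 12:
        problems.append("fasting shorter than 12 hours")
    if s.hours_since_alcohol < 24:
        problems.append("alcohol within 24 hours of sampling")
    if s.smoked_this_morning:
        problems.append("cigarette smoking on the morning of sampling")
    if s.caffeine_this_morning:
        problems.append("caffeine-containing drinks on the morning of sampling")
    return problems

sample = SampleIntake(time(8, 15), 13.0, 36.0, False, False)
print(fasting_violations(sample))  # [] -> acceptable
```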
Various unavoidable biological factors can lead to changes in analyte concentrations; they can only be accounted for during interpretation, provided the relevant information is available. Table 5.5 summarizes some of these factors and their respective effects. These factors should be considered when interpreting laboratory results because their influence cannot be prevented by preanalytical standardization.
| Influence (Reference) | Examples of Analyte Concentrations Changed | Remarks |
|---|---|---|
| Age | ALP, LDL cholesterol, hormones, creatinine, total WBC count, WBC subpopulations, RBC, hemoglobin, hematocrit, RBC indices, VWF, AT, PC, PS, plasminogen | Provide age-dependent reference intervals |
| Race | CK higher in black than in white males. Creatinine higher in black than in white males. Granulocytes higher in white than in black males. Hematocrit, hemoglobin, and MCV lower in African Americans than Caucasians. Hematocrit, hemoglobin, MCH, MCHC, and MPV lower in Asians than Caucasians. | Provide race-specific reference intervals |
| Gender | ALT, γ-GT, creatinine, hemoglobin, hematocrit, RBC, WBC, PLT | Provide gender-specific reference intervals |
| Pregnancy | Triglycerides ↑, homocysteine ↓, WBC ↑, d-dimers ↑, PT ↑, fibrinogen ↑ | Document months of pregnancy with laboratory results |
| Altitude | CRP, hemoglobin ↑, hematocrit ↑, RBC ↑, transferrin ↓ | Consider weeks of adaptation when coming from or going to high altitude |
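The remarks column of Table 5.5 amounts to a partitioned lookup. Below is a minimal sketch of age- and gender-partitioned reference interval selection, with entirely placeholder values; real intervals must come from validated sources such as those in the Appendix.

```python
# Hypothetical partitioned reference intervals keyed by (analyte, sex);
# each entry maps an age band (years) to an interval. Values are placeholders,
# not reference data.
REFERENCE_INTERVALS = {
    ("creatinine", "F"): [((18, 59), (45.0, 84.0)), ((60, 120), (45.0, 90.0))],
    ("creatinine", "M"): [((18, 59), (59.0, 104.0)), ((60, 120), (59.0, 110.0))],
}

def lookup_interval(analyte: str, sex: str, age: int) -> tuple:
    """Select the partition matching the patient's sex and age, as the
    'Remarks' column of Table 5.5 recommends for unmodifiable factors."""
    for (age_lo, age_hi), interval in REFERENCE_INTERVALS[(analyte, sex)]:
        if age_lo <= age <= age_hi:
            return interval
    raise ValueError(f"no interval defined for {analyte}, {sex}, age {age}")

print(lookup_interval("creatinine", "F", 45))  # (45.0, 84.0)
```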
Due to the dramatic physiologic changes associated with growth and development, the reference intervals for many analytes differ substantially with respect to an individual’s age and gender (see Chapter 9 and the Appendix). In newborns, the body fluids reflect the trauma of birth and early postnatal events related to the adaptation of the baby to extrauterine life. Immediately after birth, infants usually experience a mild, transient metabolic acidosis due to the accumulation of lactate. This acid-base disturbance usually normalizes within the first day after birth. The CALIPER study is an excellent source of reference intervals in childhood (see the Appendix). In the early hours of extrauterine life, the concentrations of some biochemical markers (AST, direct bilirubin, total bilirubin, creatinine, CRP, GGT, immunoglobulin G [IgG], LD, magnesium, phosphate, rheumatoid factor, uric acid) are increased, reflecting maternal concentrations, but they decline within the first 2 weeks of life. Concentrations of other markers (e.g., amylase, transferrin, antistreptolysin O [ASO], cholesterol, IgA, IgM) are very low in the neonatal period and gradually increase within the first 2 weeks of extrauterine life. This upward trend in analyte concentrations continues from birth to 18 years. Most biochemistry parameters (albumin, ALP, AST, total bilirubin, creatinine, IgM, iron, lipase, transferrin, HDL cholesterol, and uric acid) exhibit differences between genders during the early childhood years. However, these differences are most pronounced during puberty (age 14 to 18 years), due to the strong influence of sexual development and growth.
Hemoglobin concentration, hematocrit, and the other RBC indices follow a similar pattern, showing a gradual increase during the first 10 years of life. The first gender differences are observed at the age of 10 years, when values in boys show a sharp increase during puberty and adolescence. Concentrations in females are much lower, but they also slowly increase throughout puberty. Such gender differences are related to the lower metabolic demand, decreased muscle mass, and lower iron stores in females.
The concentration of thrombopoietin peaks shortly after birth and then slowly decreases. Reflecting the change in thrombopoietin concentration, the platelet count peaks immediately after birth and then declines during childhood and into adulthood. The WBC count is also higher in the early extrauterine days and throughout the first couple of years of childhood; values decline in older children. Females have a slightly higher platelet count than males during adolescence and adulthood.
Although bone marrow cellularity decreases with age, in the absence of disease the WBC count, hemoglobin, platelet count, and differential are maintained within adult reference intervals in individuals older than 65 years.
Hemostasis develops during fetal development and changes with gestational age. In neonates, the concentrations of the proteins of the prothrombin and contact factor groups are lower than in adults, due to liver immaturity, and reach adult values only after 6 months of age.
Samples should be taken before any therapeutic and diagnostic procedures that have a potential influencing effect.
Tobacco smoking leads to several acute and chronic changes in the concentrations of numerous analytes.
Even within only 1 hour of smoking one to five cigarettes, there is an increase in serum concentration of fatty acids, epinephrine, free glycerol, aldosterone, and cortisol.
Diet substantially affects the composition of plasma. The effects of diet can be long term and acute.
Physical activity of varying duration and intensity leads to changes in the plasma composition of many analytes. The extent of this change depends on training status, intake of liquid, electrolytes and carbohydrates, and even the ambient temperature.
Most of the reference interval data for children are obtained from the CALIPER study.
As mentioned earlier, interference factors have the ability to interfere with the analytical procedure and alter the test results. The effect of interference factors depends on the method—that is, the same interferent may not necessarily affect two different methods used to measure the same analyte. Common interference factors are hemolysis, lipemia, icterus, drugs, paraproteins, and various sample contaminants such as gels, tube additives, and fibrin clots.
Interfering factors are considered clinically relevant when the bias caused by their interference is greater than the maximum allowable deviation of a measurement procedure. How this “maximum allowable deviation” should be established is still debated. The Clinical and Laboratory Standards Institute (CLSI) EP7-A2 guideline, for example, sets this criterion at ±10% as a rule of thumb. Others would argue that the degree of allowable deviation caused by interfering factors should be derived (I) from data on the biological variation of the analyte, (II) by simulation modeling based on the effect of preanalytical and analytical performance on clinical decisions or patient outcomes, or (III) from information on the state of the art. The choice of the method for determining the maximum allowable deviation for a certain analyte depends not only on the medical use of the test but also on the national and international regulations in use.
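For option (I), a widely used derivation sets the desirable allowable bias at one-quarter of the combined within-subject (CVI) and between-subject (CVG) biological variation. A minimal sketch of that calculation follows; the CV values in the example are placeholders, not reference data.

```python
import math

def desirable_bias_limit(cv_within: float, cv_between: float) -> float:
    """Desirable allowable bias (%) derived from biological variation:
    0.25 * sqrt(CVI^2 + CVG^2), with both CVs given in percent."""
    return 0.25 * math.sqrt(cv_within**2 + cv_between**2)

def interference_is_relevant(observed_bias_pct: float,
                             cv_within: float, cv_between: float) -> bool:
    """An interference is clinically relevant when the bias it causes
    exceeds the allowable deviation for the measurement procedure."""
    return abs(observed_bias_pct) > desirable_bias_limit(cv_within, cv_between)

# Illustrative numbers only: an analyte with CVI = 5% and CVG = 10%.
print(round(desirable_bias_limit(5.0, 10.0), 2))  # 2.8 (% allowable bias)
print(interference_is_relevant(4.0, 5.0, 10.0))   # True: 4% exceeds the limit
```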
Interferences can be endogenous and exogenous. Endogenous interferences originate from the substances present in the patient sample, whereas exogenous interferences relate to the effect of various substances added to the patient sample, such as separator gels, anticoagulants, surfactants, and so on, all of which may cause significant interference. ,
Hemolysis is defined as the disruption of the membranes of erythrocytes and other blood cells, accompanied by the release of cell components into the plasma and red coloration of the serum (or plasma) to various degrees after centrifugation. Although hemoglobin is the most abundant protein in RBCs, hemolysis is not necessarily always associated with the release of hemoglobin into the surrounding extracellular fluid. For example, if a blood sample is stored at a low temperature, low molecular weight intracellular components such as electrolytes diffuse out of the cells, but hemoglobin does not. Furthermore, the efflux of cell components due to cell lysis affects all blood cells (i.e., platelets and WBCs), not only erythrocytes. Therefore it is important to remember that red coloration of the serum or plasma can never accurately predict the concentration of blood cell components.
Hemolysis is the most common preanalytical error and the most common cause of sample rejection. It occurs with a frequency of up to 30% and accounts for almost 60% of unsuitable specimens. The frequency of hemolysis largely depends on the collection facility, the characteristics of the patient population, and the type of professional performing the phlebotomy. The highest frequency of hemolysis has been observed in samples from emergency departments, pediatric departments, and intensive care units, whereas hemolysis has proven to be least frequent in outpatient phlebotomy centers, where blood sampling is done by specialized laboratory staff. These differences are due to the level of knowledge and skills of the staff who perform the blood collection. One large Australian study of five hospitals, covering October 2009 to September 2013, found that the hemolysis rate is much higher in emergency departments (up to 8.73%, depending on the triage category) than in other inpatient settings (<4%). Interestingly, the hemolysis rate was highest in patients triaged in the most urgent category. The hemolysis rate was also higher when phlebotomy was done by clinical staff rather than by laboratory phlebotomists.
The two major sources of hemolysis are in vivo hemolysis and in vitro hemolysis. In vivo hemolysis is a result of a pathologic condition and occurs within the body before the blood has been drawn. It may occur as a result of numerous biochemical (enzyme deficiencies, erythrocyte membrane defects, hemoglobinopathies), physical (prolonged marching, drumming, prosthetic heart valves), chemical (ethanol, drug overdose, toxins, snake venom), or immunologic (autoantibodies) mechanisms, and infections (babesiosis, malaria). In vivo hemolysis can further be categorized as intravascular and extravascular, depending on the site of the destruction of RBC. Intravascular hemolysis occurs as a direct and immediate disruption of RBC due to the cell injury within the vasculature, whereas in extravascular hemolysis, RBC membranes are damaged by the reticuloendothelial system, primarily in the spleen. The most common causes of in vivo hemolysis are reaction to incompatible transfusion and autoimmune hemolytic anemia.
In vivo hemolysis is not very common and accounts for only 3% of all hemolyzed samples. Nevertheless, in vivo hemolysis is of great clinical importance because it reflects an underlying pathologic process in the patient. Laboratories should therefore have a procedure in place for distinguishing in vivo from in vitro hemolysis. In vivo hemolysis should always be suspected when a patient’s blood remains hemolyzed over a longer period, when different sample types (e.g., citrate, serum, and heparinized tubes) are all hemolyzed, or when hemolysis recurs on repeated blood sampling even after special care has been taken to avoid it.
The following common findings are associated with in vivo hemolysis and may help in distinguishing in vivo from in vitro hemolysis:
dark brown serum/plasma and urine,
↓↓↓ serum/plasma haptoglobin,
hemoglobinuria (free hemoglobin in urine) and methemoglobinuria,
↑indirect bilirubin concentration in serum/plasma,
↑reticulocyte count (compensatory bone marrow response),
normal potassium concentration in serum/plasma,
↑↑ LD in serum/plasma.
Decreased haptoglobin concentration in serum and free hemoglobin in urine are the most pronounced and specific laboratory signs of in vivo hemolysis. Haptoglobin is a protein that binds free hemoglobin in the circulation to prevent hemoglobin-induced oxidative damage. Once hemoglobin is released from the erythrocyte into the plasma, it forms complexes with haptoglobin, and those complexes are removed from the circulation by macrophages. In more pronounced cases of in vivo hemolysis, haptoglobin in serum can be undetectable (i.e., below the detection range), whereas its concentration in cases of in vitro hemolysis remains unchanged. When in vivo hemolysis is confirmed, the laboratory should not reject hemolyzed samples for analysis, because the parameters in hemolyzed samples reflect the actual patient condition and are extremely relevant for adequate patient care (diagnosis, therapy management, monitoring).
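As a rough illustration only, the findings above could be combined into a screening heuristic along the following lines. The rule, its weighting, and the field names are hypothetical; any real procedure would have to be defined and validated locally.

```python
def suspect_in_vivo_hemolysis(haptoglobin_low: bool,
                              hemoglobinuria: bool,
                              indirect_bilirubin_high: bool,
                              reticulocytes_high: bool,
                              potassium_normal: bool) -> bool:
    """Heuristic screen combining the findings listed above.
    Markedly decreased haptoglobin and hemoglobinuria are the most
    specific signs; the other findings add supporting evidence."""
    specific = haptoglobin_low and hemoglobinuria
    supporting = sum([indirect_bilirubin_high, reticulocytes_high,
                      potassium_normal])
    return specific or (haptoglobin_low and supporting >= 2)

# A repeatedly hemolyzed patient with undetectable haptoglobin,
# hemoglobinuria, raised indirect bilirubin, and normal potassium:
print(suspect_in_vivo_hemolysis(True, True, True, True, True))  # True
```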
In vitro hemolysis occurs outside the patient, at many steps of the preanalytical phase: blood sampling, sample handling and delivery to the laboratory, and sample storage. Causes of in vitro hemolysis are described in Chapter 4.
Hemolysis is an endogenous interference that causes clinically relevant bias in patient results through several distinct mechanisms, described below.
Spectrophotometric interference of hemolysis occurs due to the ability of hemoglobin to absorb light at 415-, 540-, and 570-nm wavelengths. This characteristic of hemoglobin causes optical interference that can lead to either falsely increased or decreased concentrations of the measured parameters. The direction and degree of the interference largely depend on the analyte and the method.
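Because oxyhemoglobin absorbs strongly at 415 nm, the degree of hemolysis can itself be estimated spectrophotometrically. The sketch below is a minimal illustration of such an estimate, patterned on the Allen-corrected (Harboe-type) calculation; the coefficients, the 1:11 dilution, and the example readings are assumptions that would need to be checked against the method actually in use.

```python
# Estimate free hemoglobin in diluted plasma from absorbance readings,
# using an Allen-type correction around the 415-nm oxyhemoglobin peak.
# Coefficients follow one commonly cited formulation of the Harboe
# method; treat them as assumptions to be verified locally.

def free_hemoglobin_g_per_l(a415: float, a380: float, a450: float,
                            dilution_factor: float = 11.0) -> float:
    """Approximate free hemoglobin (g/L) in the original plasma sample."""
    corrected = 167.2 * a415 - 83.6 * a380 - 83.6 * a450
    return corrected * dilution_factor / 1000.0

if __name__ == "__main__":
    # Hypothetical readings from a 1:11 dilution of mildly hemolyzed plasma.
    print(f"{free_hemoglobin_g_per_l(0.210, 0.050, 0.040):.2f} g/L")  # ~0.30
```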
Some components are present in blood cells at concentrations many times higher than those in the extracellular space (i.e., plasma or serum). Table 5.6 shows some of the most pronounced differences between intracellular (RBC) and extracellular concentrations.
Analyte | Intracellular Concentration (Relative to Extracellular)
---|---
Lactate dehydrogenase | ↑ 160×
Inorganic phosphate | ↑ 100×
Potassium | ↑ 40×
Aspartate aminotransferase | ↑ 40×
Folic acid | ↑ 30×
Alanine aminotransferase | ↑ 7×
Magnesium | ↑ 3×
It follows that concentrations of the listed analytes are dramatically increased when measured in hemolyzed plasma (or serum), owing to the efflux of those substances from erythrocytes into the sample. The most pronounced effect of hemolysis is seen for LDH. LDH activity may be increased by over 20% in mildly hemolyzed samples (at a free hemoglobin concentration of only 0.27 g/L), by over 60% at 0.75 g/L of free hemoglobin, and by over 350% in grossly hemolyzed samples with 3.34 g/L of free hemoglobin.
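To illustrate how such interference data might be put to work, the sketch below linearly interpolates the LDH bias between the three data points just cited. Linear interpolation (and flat extrapolation beyond the last point) is purely an illustrative assumption, not a validated correction model.

```python
# Interpolate the approximate positive bias in measured LDH activity
# from the free-hemoglobin concentration, using the three data points
# cited in the text. Linear interpolation between the points (and flat
# extrapolation outside them) is an illustrative assumption only.

FREE_HB_G_L = [0.27, 0.75, 3.34]    # free hemoglobin, g/L
LDH_BIAS_PCT = [20.0, 60.0, 350.0]  # approximate positive bias, %

def ldh_bias_percent(free_hb: float) -> float:
    """Approximate % increase in measured LDH at a given free Hb (g/L)."""
    if free_hb <= FREE_HB_G_L[0]:
        # Assume bias scales down linearly toward zero below the first point.
        return LDH_BIAS_PCT[0] * free_hb / FREE_HB_G_L[0]
    if free_hb >= FREE_HB_G_L[-1]:
        return LDH_BIAS_PCT[-1]
    for (x0, y0), (x1, y1) in zip(zip(FREE_HB_G_L, LDH_BIAS_PCT),
                                  zip(FREE_HB_G_L[1:], LDH_BIAS_PCT[1:])):
        if x0 <= free_hb <= x1:
            return y0 + (y1 - y0) * (free_hb - x0) / (x1 - x0)
    raise ValueError("unreachable")

if __name__ == "__main__":
    print(f"{ldh_bias_percent(1.5):.0f}% bias at 1.5 g/L free Hb")  # ~144%
```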
Because intracellular components may also escape from platelets during clotting, there is a marked difference in potassium concentration between serum and plasma. The mean estimated serum–plasma difference in potassium concentration is 0.36 ± 0.18 mmol/L, and this difference is positively associated with the platelet count. Plasma is therefore the recommended sample type for accurate measurement of potassium.
Some analytes, such as albumin, bilirubin, glucose, and sodium, are present in much higher concentrations in plasma than in blood cells. For those parameters, hemolysis causes a dilution effect, and their concentrations will be lower in hemolyzed samples. This dilution causes clinically significant bias only at higher degrees of hemolysis. For example, glucose measured with Beckman Coulter reagents on the Olympus AU2700 chemistry analyzer (Beckman Coulter, O’Callaghan’s Mills, County Clare, Ireland) shows clinically significant negative bias (−8.3%) only at a free hemoglobin concentration of 3.34 g/L.
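A simple two-compartment mixing calculation illustrates this dilution effect. In the sketch below, released intracellular fluid, assumed to contain little of the affected analyte, mixes with plasma and lowers the measured concentration; all volumes and concentrations are hypothetical.

```python
# Approximate dilutional bias from hemolysis using a two-compartment
# mixing model. All values are hypothetical illustrations only.

def diluted_concentration(plasma_conc: float,
                          lysate_conc: float,
                          lysate_fraction: float) -> float:
    """Concentration after mixing plasma with a given volume fraction
    of released intracellular fluid."""
    return (plasma_conc * (1.0 - lysate_fraction)
            + lysate_conc * lysate_fraction)

if __name__ == "__main__":
    glucose = diluted_concentration(plasma_conc=5.5,   # mmol/L in plasma
                                    lysate_conc=0.0,   # assume ~none released
                                    lysate_fraction=0.05)
    print(f"{glucose:.2f} mmol/L ({(glucose / 5.5 - 1) * 100:.1f}% bias)")
```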
Various blood cell components may affect the analyte measurement procedure by directly or indirectly modifying the analyte (Table 5.7).
Direct mechanism | Indirect mechanism
---|---
An example of direct interference through competition is the effect of the enzyme adenylate kinase, which is present in both erythrocytes and platelets. Adenylate kinase (EC 2.7.4.3) catalyzes the reversible interconversion of ATP and AMP with two ADP molecules and maintains the adenine nucleotide content of the cell. When released from cells during hemolysis, adenylate kinase may compete with creatine kinase (CK) for ADP in CK assays if inhibitors are not included in the reaction mixture.
Hemoglobin released from erythrocytes during hemolysis may also interfere with various assays through its pseudo-peroxidase activity. In bilirubin assays, for example, this pseudo-peroxidase activity inhibits the formation of the diazonium salt and thereby biases the measured bilirubin concentration.
Hemolysis may also cause clinically significant interference with a wide range of analytes in immunochemistry assays. This interference results from modification of the reaction components (antigens and antibodies) by the proteolytic action of cathepsin E, the major proteolytic enzyme in mature erythrocytes. Proteolytic enzymes released from erythrocytes may mask or potentially enhance epitope recognition in various immunoassays, and the resulting measurement bias varies in both degree and direction, depending on the assay. For example, current cardiac troponin (cTn) assays show variable susceptibility to hemolysis interference.
Hemolysis has been shown to falsely decrease concentrations of cTnT, insulin, cortisol, testosterone, and vitamin B12 and to falsely increase prostate-specific antigen (PSA) and cTnI in a concentration-dependent manner. However, the degree and direction of bias are analyte and method dependent. For example, hemolysis causes falsely decreased cTnT concentrations with the Roche hs-cTnT assay on the Elecsys E170 immunochemistry analyzer, whereas cTnI concentrations measured with the Ortho Clinical Diagnostics TnI ES assay on the Vitros 5600 Integrated System (Ortho Clinical Diagnostics, Rochester, NY) are falsely increased in hemolyzed samples. The Abbott Architect cTnI assay appears to be more robust against hemolysis interference, and the microparticle enzyme immunoassay for cTnI (Abbott Laboratories, Abbott Park, IL) is unaffected by moderate hemolysis, showing clinically relevant bias only in grossly hemolyzed samples.
Lipemia is defined as turbidity of the sample visible to the naked eye. This turbidity is caused by light scattering from large lipoprotein particles (chylomicrons). Increased lipoprotein concentrations in blood most commonly result from a postprandial rise in triglycerides, parenteral lipid infusions, or certain lipid disorders. Not all lipoproteins contribute equally to sample turbidity; the effect depends on particle size. Chylomicrons and very low-density lipoproteins (VLDL), the largest lipoprotein particles in the circulation, contribute the most. To avoid postprandial lipemia, patients are therefore asked to fast for 12 hours before blood sampling.
Lipemia is an important endogenous interference that may cause clinically relevant bias in patient results through several mechanisms, described below.
Lipemia causes interference by both light absorbance and light scattering. A lipemic sample absorbs light, decreasing the intensity of the light passing through it. Lipoprotein particles absorb light across a wide range of wavelengths (300 to 700 nm); absorbance rises with decreasing wavelength and is maximal toward the ultraviolet range. This is why many enzymatic methods in which the formation or consumption of NAD(P)H is measured at 340 nm are strongly affected by lipemia.
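The consequence of this added absorbance follows directly from the Beer-Lambert law. The sketch below shows how turbidity that contributes a constant absorbance at 340 nm inflates an unblanked end-point NAD(P)H measurement; the NADH molar absorptivity (6.22 L/(mmol·cm)) is a standard value, while the turbidity absorbance and analyte concentration are hypothetical.

```python
# Effect of lipemic turbidity on an end-point assay read at 340 nm.
# The NADH molar absorptivity at 340 nm (6.22 L/(mmol*cm)) is a standard
# value; the turbidity absorbance used below is a hypothetical example.

EPSILON_NADH_340 = 6.22  # L * mmol^-1 * cm^-1
PATH_CM = 1.0            # cuvette path length

def apparent_nadh_mmol_l(true_nadh_mmol_l: float,
                         turbidity_abs: float) -> float:
    """Apparent NADH concentration when turbidity adds to A340 unblanked."""
    a_measured = EPSILON_NADH_340 * true_nadh_mmol_l * PATH_CM + turbidity_abs
    return a_measured / (EPSILON_NADH_340 * PATH_CM)

if __name__ == "__main__":
    true_c = 0.10  # mmol/L NADH actually formed in the reaction
    apparent = apparent_nadh_mmol_l(true_c, turbidity_abs=0.25)
    print(f"bias: {(apparent / true_c - 1) * 100:.0f}%")  # ~40% positive bias
```

Note that a constant turbidity offset cancels in kinetic (rate) measurements; it is end-point readings without a sample blank that are most vulnerable.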
Lipemic samples also cause light scattering. Light scattering occurs in all directions, and its intensity depends on the number and size of the lipoprotein particles and on the wavelength of measurement. For this reason, light scattering from lipoprotein particles causes significant interference with turbidimetry and nephelometry. In methods where the transmitted light is inversely proportional to the concentration of the analyte, sample turbidity causes positive bias if no sample blank is used; in some competitive assays, where the transmitted light is directly proportional to the concentration of the analyte, sample turbidity causes negative bias.
In healthy individuals in the fasting state, lipids make up only a minor portion of plasma (<10% of the total plasma volume); the rest is water. An increase in the concentration of lipoprotein particles increases the plasma volume occupied by lipids, and substances that are not lipid soluble are displaced into the aqueous portion of the plasma. Lipemia therefore falsely decreases the measured concentration of an analyte in all methods that measure the analyte in the total plasma volume.
One example of interference caused by this volume displacement effect is the bias in electrolyte measurement leading to so-called pseudo-hyponatremia. This interference affects electrolytes only when they are measured by flame photometry or by indirect ion-selective electrode (ISE) methods, not by direct potentiometry (for more details, see Chapters 17 and 37). It must be noted, however, that the volume displacement effect becomes relevant only in grossly lipemic samples with triglyceride concentrations greater than 17 mmol/L (1504 mg/dL).
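A short worked example makes the volume displacement effect concrete. Indirect methods report sodium per litre of total plasma, which in normal samples is about 93% water, whereas direct potentiometry senses only the water phase and is insensitive to the lipid volume. The water fractions and concentrations below are illustrative assumptions.

```python
# Worked example of pseudo-hyponatremia caused by volume displacement.
# Indirect methods dilute whole plasma and report sodium per litre of
# total plasma; direct potentiometry senses the water phase and, by
# calibration convention, reads the same as indirect methods in normal
# plasma. All numbers are illustrative assumptions.

NORMAL_WATER_FRACTION = 0.93  # typical water fraction of normal plasma

def indirect_sodium(na_in_plasma_water: float, water_fraction: float) -> float:
    """Sodium per litre of total plasma, as an indirect method reports it."""
    return na_in_plasma_water * water_fraction

def direct_sodium(na_in_plasma_water: float) -> float:
    """Direct potentiometry result, conventionally scaled to normal plasma."""
    return na_in_plasma_water * NORMAL_WATER_FRACTION

if __name__ == "__main__":
    na_water = 140.0 / NORMAL_WATER_FRACTION  # ~150.5 mmol/L in plasma water
    # Gross lipemia: lipid assumed to displace water down to an 80% fraction.
    print(f"direct:   {direct_sodium(na_water):.0f} mmol/L")          # 140
    print(f"indirect: {indirect_sodium(na_water, 0.80):.0f} mmol/L")  # ~120
```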
Upon centrifugation of a lipemic sample, lipoproteins are not homogeneously distributed in the serum or plasma, owing to the lipid gradient (Fig. 5.3). Water-soluble analytes are more concentrated in the lower layer of the plasma or serum, whereas lipids and lipid-soluble analytes, such as drugs and some lipid-soluble hormones, are more concentrated in the top, lipid-rich layer. This is especially important in automated chemistry analyzers whose sample probes aspirate from a fixed depth: for analytes that are unevenly distributed between the lipid and aqueous portions of the sample, results may differ depending on the layer from which the probe draws the sample.
An excess of lipoproteins in the blood may also interfere with electrophoretic and chromatographic methods by causing abnormal peaks. Increased concentrations of triglycerides and lipoprotein particles may disturb the electrophoretic pattern and band morphology, as well as falsely increase the relative percentages of the prealbumin, albumin, and α1- and α2-globulin regions. Moreover, lipemia may even affect some immunochemistry assays by masking binding sites on antigens and antibodies, thereby physically interfering with antigen–antibody binding.
One additional complication of excessive lipemia is increased susceptibility of the sample to hemolysis, which produces a characteristic turbid, reddish appearance (the so-called “strawberry milk” appearance). This effect is most probably caused by increased fragility of the erythrocyte membranes due to alterations in the phospholipid composition of the membrane layer and becomes more pronounced as lipid (particularly triglyceride) concentrations increase.
In the hospital environment, lipemic samples are not infrequent. They most often originate from emergency departments, intensive care units, and endocrinology and gastroenterology clinics, from patients with conditions such as acute pancreatitis, acute or chronic kidney failure, thyroid or lipid disorders, and diabetes mellitus, and they quite commonly require immediate results. Unlike hemolysis, the interference caused by lipemia can be fully eliminated, or at least reduced, by removing the excess lipids from the sample; methods include ultracentrifugation, high-speed centrifugation, and lipid-clearing agents. Even when lipids have been successfully removed, any visible turbidity of a sample should be documented and reported with the test results, because it offers clinically useful information about the patient. Moreover, testing for lipids and for lipid-soluble drugs (e.g., benzodiazepines) and hormones (e.g., thyroid hormones) should always be done on the native sample, before delipidation.