Since the initial descriptions of successful deceased donor liver transplantation (LT) in the 1960s by Dr. Thomas Starzl (see Chapter 125 ), there has been tremendous change and growth in the field of transplantation. Advancements in organ preservation, immunosuppression, perioperative management, and refinements in surgical technique have contributed to an increased number of patients undergoing LT and growing indications for the procedure (see Chapter 105 ). For many patients with end-stage liver disease (ESLD), LT represents the only viable treatment modality. LT is an excellent option as curative therapy for early-stage hepatocellular carcinoma (HCC; see Chapter 108A ) and in select cases of cholangiocarcinoma (CCA; see Chapter 108B ), with good long-term outcomes. The obesity epidemic has led to a rapidly increasing number of patients with nonalcoholic steatohepatitis (NASH) listed for transplant. Direct-acting antiviral therapy has altered the outcome of LT for hepatitis C virus (HCV) and allowed for the use of organ donors who are HCV antibody-positive. Overall, patient survival after LT continues to improve and currently is 93% at 1 year, 87% at 3 years, and 80% at 5 years.
Despite remarkable advancements in the last 50 years in the field of LT, limitations persist. Notably, there is a relatively fixed pool of cadaveric organ donors with a growing number of new waiting list registrations. Techniques implementing the use of donation after cardiac death, living-donor liver transplantation (LDLT), and ex-vivo normothermic liver perfusion may extend the benefit of LT to more patients awaiting transplantation.
This chapter presents a broad overview of LT, including common criteria for recipient and donor selection (see Chapter 105 ), common postsurgical complications (see Chapter 111 ), and outcomes related to LT and the underlying etiology of ESLD. Specialized techniques including LT and hepatectomy in LDLT are addressed in Chapters 121 , 125 , and 128 .
For many patients with irreversible acute and chronic liver disease and cirrhosis, orthotopic LT (OLT) represents the only curative treatment option, regardless of etiology (see Chapters 105 and 107 ). Despite increasing acuity, 5-year patient survival after LT has increased from less than 50% to greater than 80% over the last 50 years. The improvement in patient survival has contributed to the expanded indications for LT and concomitant increase in the number of patients referred and listed for transplantation (see Chapter 105 ). The consequence of this success is the persistent disparity between the number of potential recipients awaiting LT and the number of available donor organs.
In 2019, 8896 LTs were performed in the United States, the greatest number of LTs performed in a single year and a 41% increase from 10 years ago. LDLT increased 31% compared with 2018, accounting for 524 LTs. However, there is also continued growth in the number of waiting list registrations, with 12,767 new registrations in 2019 ( Fig. 109.1 ). The donor-to-recipient disparity leads to longer waiting times and worsening medical status. The severity of liver disease at time of listing has increased, as measured by a higher first active Model for End-Stage Liver Disease (MELD) score (see Chapter 4 ). The proportion of patients receiving LTs with MELD scores of 35 or higher has doubled in the last decade, representing greater than 10% of LT recipients. Despite this, the pre-transplant mortality rate has declined over the last 5 years, from a peak of 17.9 per 100 waiting list years in 2014 to 12.4 per 100 waiting list years in 2019, because of improvements in perioperative care. The most dramatic improvement in pre-transplant mortality was observed among patients with HCV, reflecting the efficacy of direct-acting antiviral therapy.
The process by which potential recipients are listed for LT has undergone extensive revisions to optimize equitable and just allocation of a scarce resource. Before 2002, potential recipients were prioritized based on the Child-Turcotte-Pugh (CTP) scoring system ( Table 109.1 ), time on the waiting list, and patient location (intensive care unit [ICU], hospitalized, ambulatory). Waiting lists grew under this system, and it became evident that these parameters were inadequate measures of disease severity. In 1999 the Institute of Medicine proposed that a continuous disease severity score based on medical urgency over waiting time could improve the allocation of cadaveric livers for LT. This resulted in the adoption of the MELD criteria to assess necessity for LT and determine waiting list priority ( Table 109.2 ). As shown in Figure 109.2 , the use of MELD criteria for predicting mortality from ESLD results in an appropriate correlation between severity score and actuarial survival at 3 months. Survival benefit analysis of LT recipients stratified by MELD score demonstrates that the risks involved in transplantation are equivalent or less than the risks associated with remaining a transplant candidate on the waiting list with a MELD score greater than 15 (see Chapter 4 ).
TABLE 109.1 Child-Turcotte-Pugh (CTP) Scoring System

| | 1 POINT | 2 POINTS | 3 POINTS |
| --- | --- | --- | --- |
| Encephalopathy (grade) | None | 1 or 2 | 3 or 4 |
| Ascites | Absent | Slight | Moderate |
| Bilirubin (mg/dL) | 1–2 | 2–3 | >3 |
| Albumin (g/dL) | >3.5 | 2.8–3.5 | <2.8 |
| Prothrombin time (seconds prolonged) | 1–4 | 4–6 | >6 |
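As an illustration, the scoring above can be expressed in code. This is a sketch only: the function and parameter names are hypothetical, the handling of the shared range boundaries (e.g., a bilirubin of exactly 2 mg/dL) is an assumption the table leaves ambiguous, and the Child class cutoffs in the final comment (A: 5–6, B: 7–9, C: 10–15) are standard convention rather than stated in the table.

```python
def ctp_score(encephalopathy_grade, ascites, bilirubin, albumin, pt_prolonged):
    """Child-Turcotte-Pugh score from the five parameters above.

    encephalopathy_grade: 0 (none) through 4
    ascites: 'absent', 'slight', or 'moderate'
    bilirubin: mg/dL; albumin: g/dL; pt_prolonged: seconds over control
    """
    pts = 0
    # Encephalopathy: none = 1 point, grade 1-2 = 2 points, grade 3-4 = 3 points
    pts += 1 if encephalopathy_grade == 0 else (2 if encephalopathy_grade <= 2 else 3)
    pts += {'absent': 1, 'slight': 2, 'moderate': 3}[ascites]
    # Boundary handling at the shared endpoints (2, 3 mg/dL, etc.) is an assumption
    pts += 1 if bilirubin < 2 else (2 if bilirubin <= 3 else 3)
    pts += 1 if albumin > 3.5 else (2 if albumin >= 2.8 else 3)
    pts += 1 if pt_prolonged < 4 else (2 if pt_prolonged <= 6 else 3)
    return pts

# ctp_score(0, 'absent', 1.5, 4.0, 2) -> 5 (Child class A by convention)
```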
TABLE 109.2 Three-Month Mortality by Model for End-Stage Liver Disease (MELD) Score a

| MELD SCORE | NO. OF PATIENTS | MORTALITY RATE (%) | DEATH OR REMOVAL FROM LIST BECAUSE OF ILLNESS (%) |
| --- | --- | --- | --- |
| <9 | 124 | 1.9 | 2.9 |
| 10–19 | 1800 | 6 | 7.7 |
| 20–29 | 1098 | 19.6 | 23.5 |
| 30–39 | 295 | 52.6 | 60.2 |
| ≥40 | 120 | 71.3 | 79.3 |
a R = (0.957 × ln(creatinine mg/dL) + 0.378 × ln(total bilirubin mg/dL) + 1.120 × ln(INR) + 0.643) × 10.
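The footnote formula translates directly to code. In this sketch, the floor of 1.0 applied to each laboratory value, the 4.0 mg/dL cap on creatinine, and rounding to the nearest whole number follow standard UNOS scoring conventions, which the footnote itself does not spell out; the function name is illustrative.

```python
import math

def meld_score(creatinine_mg_dl, bilirubin_mg_dl, inr):
    """Classic MELD score per the footnote formula.

    Lab values below 1.0 are set to 1.0 and creatinine is capped at
    4.0 mg/dL, per standard UNOS conventions (an assumption not
    spelled out in the footnote itself).
    """
    creat = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    raw = (0.957 * math.log(creat)
           + 0.378 * math.log(bili)
           + 1.120 * math.log(inr)
           + 0.643) * 10
    return round(raw)

# Example: creatinine 2.0 mg/dL, bilirubin 3.0 mg/dL, INR 1.5
# meld_score(2.0, 3.0, 1.5) -> 22
```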
Special considerations, or exception points, for subsets of patients with liver disease continue to evolve. At the time the MELD score was adopted, patients with HCC and early cirrhosis were prioritized on the waiting list at the point when they would potentially benefit most from LT (see Chapter 108A ) and before progression of tumor burden would eliminate LT as a therapeutic option. Guidelines exist for awarding exception points for other liver disease–related conditions whose severity and associated risk of mortality are not captured by the calculated MELD score, including hilar CCA, hepatopulmonary syndrome, portopulmonary hypertension, cystic fibrosis, hepatic artery thrombosis after LT, pediatric hepatoblastoma, inborn errors of metabolism, familial amyloidosis, and primary hyperoxaluria (see Chapters 51 , 77 , 78 , 105 , and 110 ). However, the process is highly variable by region.
In the last decade, geographic disparities in deceased-donor transplant rates continued to grow within the United States, leading to the implementation of a new liver allocation policy and establishment of the National Liver Review Board to optimize outcomes for liver candidates while simultaneously maintaining good organ stewardship. Across donation service areas throughout the United States, the median MELD at time of transplant varied significantly: the greatest median MELD at time of transplant was reported in New York at 39 and the lowest in Arkansas at 19. To address these disparities, the Organ Procurement and Transplantation Network (OPTN) implemented a new distribution system that emphasizes the medical urgency of LT candidates and the distance between donor and recipient hospitals. The new policy went into effect in February 2020; although statistical modeling projects lower waitlist mortality, its effects on LT remain to be determined. Nonetheless, because of the limited supply of donor organs, appropriate recipient and donor selection is critical to improve resource utilization and long-term outcomes.
Common indications for LT (see Chapter 105 ) include portal hypertension (as manifested by variceal bleeding), ascites, encephalopathy, hyperbilirubinemia, hepatic synthetic dysfunction, and lifestyle limitations. Alcohol-related liver disease and NASH continue to rise, representing the two most common diagnoses of ESLD requiring LT. Malignancy, in particular HCC, is a growing indication for LT, accounting for 6.7% of LT in 2002 and 14% in 2019 (see Chapter 108 ). Five years ago, viral hepatitis was the most common etiology of liver failure resulting in LT, accounting for 25% of transplants (see Chapter 68 ). However, with the use of direct-acting antiviral agents, the proportion of LTs performed for HCV has declined dramatically and now only accounts for 7.3% of transplants as seen in Figure 109.3 . Biliary atresia is the most common indication for LT in patients younger than 18 years (see Chapter 110 ). The rising incidence of NASH coincides with an increase in obesity (body mass index [BMI] ≥ 30) and prevalence of diabetes among transplant recipients over the last decade. It is anticipated that the obesity epidemic will present new challenges to the transplant community with a rise in NASH and NASH-related HCC, exacerbating the demand for LT while simultaneously reducing the availability of suitable organ donors with non-steatotic livers (see Chapter 69 ).
Few true absolute contraindications to LT exist that uniformly portend a poor patient outcome ( Box 109.1 ). Advanced cardiopulmonary disease, known extrahepatic malignancy not meeting oncologic criteria for cure, uncontrolled systemic sepsis from a source originating outside the liver, acquired immunodeficiency syndrome (AIDS), and ongoing or recent substance abuse are absolute contraindications. Many relative contraindications are conditions that are expected to improve after successful LT. Examples include severe hemodynamic instability requiring multiple pharmacologic agents to maintain perfusion, extreme pulmonary hypertension, or severe hypoxia uncorrected by conventional intensive care measures in the context of hepatopulmonary syndrome. Other relative contraindications to LT are extensive mesenteric venous thrombosis, morbid obesity, psychiatric disorders uncontrolled by conventional means, absence of a suitable social support network, and extremes of age (see Chapter 105 ).
Advanced cardiopulmonary disease
Extrahepatic malignancy
Uncontrolled sepsis
Active substance abuse
ABO incompatibility
Hemodynamic instability
Severe hypoxia (except with hepatopulmonary syndrome)
Human immunodeficiency virus infection
Refractory psychiatric disorders
Absence of adequate social support
Advancements in surgical technique and medical supportive care have expanded LT candidacy to subsets of recipients and conditions formerly considered to be absolute contraindications, including portal vein (PV) thrombosis (PVT), human immunodeficiency virus (HIV) infection, and advanced age. Although obesity is not a contraindication to LT, several authors have shown that BMI greater than 40 is associated with significantly reduced 5-year patient and graft survival. Further, severe morbid obesity is associated with increased length of stay (LOS), risk of infectious complications, and post-transplantation malignancy. These patients also pose long-term challenges in dosing immunosuppressive medications, for which the appropriate dosing weight is difficult to determine. Given these consequences, many centers have incorporated interventions to combat obesity in transplant recipients, including nutritional education, exercise regimens, and bariatric surgery.
Transplantation for HIV is an example of a relative contraindication that is site-specific. With the advent of highly active antiretroviral therapy (HAART), HIV has become a chronic condition such that patients may begin to suffer the morbidity and mortality of other diseases, including ESLD. Patients with HIV may be considered for LT if their CD4 T-cell count is greater than 200 cells/μL and their HIV RNA viral load is less than 50 copies/mL within 12 months before LT. Additionally, patients with HIV have the option of receiving an organ from an HIV-infected donor, according to the HIV Organ Policy Equity Act, enacted in 2013.
The average age of LT recipients continues to rise. Nearly 70% of LTs are performed for patients older than 50 years, and 20% for patients older than 65. With an aging population, thorough screening for medical comorbidities typically found in older populations, such as lifestyle-limiting cardiopulmonary disease, systemic vascular disease, and chronic renal insufficiency, is crucial to successful LT. Although septuagenarians have lower 5-year patient survival (70.8%) compared with recipients younger than 60 (80.7%), survival of LT recipients, regardless of age, surpasses that of those denied LT. When appropriately selected, older patients with ESLD can benefit from LT.
With fewer contraindications to LT and a growing list of potential recipients, there is a well-established demand for donors. Optimizing donor selection and management is critical to reducing this disparity. An optimal deceased donor is generally considered an otherwise healthy, hemodynamically stable, young individual who sustained an irreversible cerebral insult resulting in brain death. Restricting use to these “ideal donors” fails to meet the needs of an expanding waiting list. With increasing demand, transplant centers are turning to marginal liver allografts from extended-criteria donors (ECDs) and donation after cardiac death (DCD). The acceptance of these more marginal allografts is associated with higher risks of primary nonfunction, early graft dysfunction, biliary complications, and decreased long-term graft survival. Thus the surgeon must make an assessment to determine the suitability of a particular donor for a specific recipient.
Numerous donor and recipient characteristics associated with graft and recipient outcomes have been evaluated, but comparisons can be difficult given variable clinical factors and severity of underlying liver disease. When considering donors, special consideration must be given to recipient selection, because a patient with a higher MELD score or increased comorbidities may not tolerate a period of slow graft function or ischemia-reperfusion injury to the same extent as a healthier, low-MELD recipient. In 2006, a donor risk index (DRI) was derived to help quantify and stratify allografts by predicted failure risk. Risk factors associated with graft loss include increased donor age, DCD, use of split grafts, Black donors, shorter donors, death because of cerebrovascular accident (CVA), and causes of brain death other than trauma or anoxia. Experience with the DRI has demonstrated that use of high-risk donor livers is associated with increased relative risk of allograft failure, complications including hepatic artery thrombosis (HAT) and biliary complications, and inferior survival in selected subsets. However, the DRI has limitations: statistical modeling cannot capture the patient’s overall health status and frailty.
With an aging population, there has been a growing need to consider organs from elderly donors. Carefully selected older donor allografts (including septuagenarians and octogenarians) have been used with good success. It is important to note the presence of atherosclerosis, steatosis, or fibrosis that may affect the recipient. Limiting cold ischemia time and degree of steatosis are hypothesized to be important in optimizing the results of transplantation from older donors. Further, accepting an older donor is associated with lower 5-year mortality as compared with those who declined the same donor and waited for another organ offer (23% vs. 41%). Therefore age alone should not exclude a potential donor when matched to an appropriate recipient.
The obesity epidemic not only challenges the transplant community with a rapidly growing number of potential recipients with NASH cirrhosis but also has the potential to limit donors with nonsteatotic livers, further exacerbating the disparity in organ donation. When encountering a graft with a fatty appearance, biopsy and the histologic determination of fat content may help guide use of steatotic livers. Histologically, microvesicular steatosis is characterized by numerous small lipid droplets in the cytoplasm without any alteration in the position of the nucleus. Macrovesicular steatosis describes the presence of large lipid droplets within hepatocyte cytoplasm that displaces the nucleus. Microvesicular steatosis has minimal effect on allograft function. However, allografts with increased macrovesicular steatosis are characterized by poor microcirculation, depleted ATP-energy stores because of impaired capacity for mitochondrial recovery, and an increased inflammatory response after more severe ischemia-reperfusion injury. Overall, this results in impaired recovery and regenerative capacity and poorer graft function. Moderate (30%–60%) and severe (>60%) macrovesicular steatosis is associated with early graft dysfunction and primary nonfunction with rates as high as 35% and 15%, respectively. When carefully selected, liver allografts with up to 40% macrosteatosis may be used in select groups of recipients with similar 5-year graft survival as compared with livers with mild macrosteatosis. Although patient survival is not affected, grafts with moderate steatosis have been associated with increased resource use including increased transfusions, longer hospital stays, and prolonged ICU course. Among these donors, it is necessary to minimize other risk factors, such as cold ischemia time, to improve recipient and graft outcomes (see Chapter 105 ).
The opioid epidemic has increased the availability of organ donors, albeit with increased risk to the recipient of potential hepatitis C transmission. Historically, use of HCV+ donors has raised concern that transplantation would result in aggressive recurrent disease in the recipient. The introduction of novel oral direct-acting antiviral (DAA) agents for HCV continues to alter the field of LT. Several studies have demonstrated that use of HCV+ donor livers in HCV+ recipients is safe, with long-term outcomes comparable to HCV– allografts, in the absence of severe inflammation or fibrosis. The use of these organs is not associated with inferior graft or patient survival. The proportion of HCV+ recipients receiving HCV+ livers increased from 6.9% in 2009 to 16.9% in 2015, and the rate of discard of HCV+ livers continues to decline. Nucleic acid testing (NAT) has expanded the donor pool as well. There is growing literature on the use of HCV Ab+/NAT– donors in HCV-seronegative recipients. Development of HCV viremia in the recipient is treatable with standard-of-care direct-acting antivirals, with no difference in graft loss or mortality. From 2017 to 2019, 9.0% of LTs were from HCV Ab+ donors. Among LT recipients without a history of HCV, 4.2% received a liver from an HCV Ab+ donor. For the first time, the percentage of recipients willing to accept an HCV+ organ has surpassed that of those who decline HCV+ organs, attributed to successful medical advancements in antiviral therapy.
Donors with serologic markers of past HBV infection offer further potential for expansion of the donor pool. Grafts from hepatitis B core antibody–positive donors may be offered to potential recipients using protocols that incorporate immunoglobulin therapy and antiviral therapy for both HBV-positive and HBV-negative recipients, with comparable survival.
Another source of allografts is DCD, formerly referred to as non-heart-beating donation. These organs are from donors who have sustained an irreversible, catastrophic illness without hope for meaningful recovery but who do not meet criteria for brain death. In the DCD setting, life support is withdrawn, and the donor is observed until time of death, which is declared by a nontransplant physician. From cessation of circulation, an additional 2- to 5-minute waiting period is mandated before organ retrieval is initiated. It is important to note that 10% of potential donors do not die within 2 hours of withdrawal of support; these patients are not candidates for subsequent organ donation and are transferred back to the ICU and allowed to expire. Circulation is unlikely to resume after 2 minutes of complete cessation; a minimum waiting period of 2 minutes is required, and a 5-minute interval between cessation of circulation and declaration of death before organ retrieval is strongly encouraged. Unlike allografts from donation after brain death (DBD), allografts from DCD donors have increased warm ischemia time, which begins with withdrawal of life support, encompasses progressive hypoxia and hypoperfusion, and ends with declaration of death and initiation of cold preservation. This prolonged warm ischemia time increases the risk for delayed graft function, primary nonfunction, and biliary complications related to ischemic cholangiopathy.
After the concept of brain death became widely accepted in 1968 with the Harvard Neurologic Definition and Criteria for Death, use of DCD organs fell out of favor. DCD organ transplantation was reintroduced in the 1990s with improvements in preservation and procurement techniques. The percentage of deceased-donor liver transplants in the United States that are DCD has grown from 1% in 1996 to 8.5% in 2019. In 2000, Reich et al. published the first successful series of controlled DCD LT, with excellent outcomes including 100% patient and graft survival and no arterial thrombosis or biliary complication during an 18-month follow-up period. Since then, numerous larger studies have published their experiences. In 2003 the University of Pennsylvania was the first to report a higher incidence of biliary complications, in particular ischemic cholangiopathy, attributed to sensitive biliary epithelium that is susceptible to prolonged warm ischemia time and more severe ischemia-reperfusion injury. Ischemic cholangiopathy is often associated with the need for hospital readmissions, prolonged antibiotic use, endoscopic and percutaneous biliary drainage procedures, and abscess formation. Additionally, it may progress to graft loss, need for re-transplantation, or even death.
Another major complication among DCD recipients is renal insufficiency. Despite normal preoperative creatinine and glomerular filtration rate (GFR), recipients of DCD organs more frequently develop acute kidney injury (AKI; 14.7% vs. 7.3% for DBD) and need for dialysis (up to 40% of DCD recipients). This is thought to be related to the increased reperfusion injury associated with DCD allografts.
Consequently, there has been interest in identifying suitable DCD donors and recipients to minimize associated morbidity and mortality. Use of standardized DCD protocols with strict selection criteria has led to excellent results, comparable to DBD donors at select institutions. Donor age less than 50 years, weight less than 100 kg, warm ischemia times less than 30 minutes, and cold ischemia times less than 10 hours are associated with improved graft and patient survival. Given that older donors are more susceptible to biliary ischemia, some suggest against use of DCD donors older than 50 years. Recipient risk factors such as older age, re-transplantation, serum creatinine greater than 2.0 mg/dL, and life support before LT are associated with inferior patient and graft outcomes. In general, donor, recipient, and surgical variables that minimize ischemia time and its associated effects are associated with more favorable outcomes because the donor graft is protected against additional ischemia. Investigators at the Mayo Clinic found that each 1-minute increase in the interval from declaration of death to cross-clamp is associated with a 16.1% increase in the odds of ischemic cholangiopathy. Similarly, prolonged cold ischemia times are associated with increased risk of graft failure: each 1-hour increase in cold ischemia time was associated with a 6% higher graft failure rate, and cold ischemia time between 6 and 10 hours was associated with a 64% higher graft failure risk compared with cold ischemia time less than 6 hours. Given the morbidity and mortality associated with ischemic cholangiopathy, protocols to minimize biliary complications have been implemented at some institutions. Tissue plasminogen activator (TPA) injection into the donor hepatic artery may lower the risk of biliary complications by minimizing thrombus formation in the peribiliary microcirculation. Flushing of the biliary system and using a less viscous preservation solution may provide some benefit as well.
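As a back-of-the-envelope illustration of the ischemia-time associations above, and assuming the reported per-unit increases compound multiplicatively (the text states only per-unit associations, so this compounding is an assumption), the relative risk multipliers might be sketched as:

```python
def ic_odds_multiplier(extra_minutes_to_crossclamp):
    """Relative odds of ischemic cholangiopathy for each additional
    minute from declaration of death to cross-clamp (16.1% per minute,
    per the Mayo Clinic association cited above)."""
    return 1.161 ** extra_minutes_to_crossclamp

def graft_failure_multiplier(extra_hours_cold_ischemia):
    """Relative graft failure rate for each additional hour of cold
    ischemia time (6% per hour, per the association cited above)."""
    return 1.06 ** extra_hours_cold_ischemia

# Under this compounding assumption, ten extra minutes to cross-clamp
# corresponds to roughly a fourfold increase in the odds of ischemic
# cholangiopathy.
```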
DCD donors provide an additional option for donation when DBD is not feasible. Acceptance of these organs is not without risk, but careful technique, standardized protocols, and donor and recipient selection may help to mitigate associated complications. An area of ongoing interest and future development is the use of normothermic ex vivo machine perfusion, which has the potential to reduce the ischemia-reperfusion injury and associated complications identified in the setting of DCD liver transplantation. Altogether, further investigation and optimization of potential donors and recipients may help to expand the donor pool.
An alternative to cadaveric donation is the use of living donor allografts. The first adult-to-adult LDLT in the United States was reported in 1998, and use has fluctuated since then. Initially, the number of LDLTs rose to a peak in 2001, representing approximately 10% of all LTs, but they currently represent 5.3% of LTs performed. LDLT is appealing because it allows for a more elective transplantation after optimizing the health of the recipient, reduces cold ischemia times, and potentially expedites waiting times. However, the risk of morbidity and mortality to an otherwise healthy donor is notable. Donor hepatectomy remains a technically demanding surgical procedure with higher complication rates than seen in living kidney donation (see Chapter 121 ). Refinements in technique are critical to optimizing donor safety, which remains of utmost concern. Despite inherent risks to the donor, right hepatic lobe LDLT has become an important option in the management of liver disease. Details of the donor operation, open and minimally invasive, are described in Chapters 121 and 128 .
In 2002, the Adult-to-Adult Living Donor Liver Transplantation (A2ALL) consortium was developed and funded by the National Institutes of Health. It consisted of nine transplant institutions experienced in performing LDLT and sought to provide accurate information on outcomes for both donors and recipients. Its results can help guide and educate surgeons and patients when weighing LDLT against deceased-donor LT (DDLT), because one must consider the potential risk to an otherwise healthy donor versus the benefit to a recipient and the likelihood of receiving a DDLT before disease progression.
Despite initial reports suggesting worse outcomes in LDLT, a trend has been seen toward improved graft and overall patient survival in the United States in the last 10 years. This is particularly evident when compared with waiting for a deceased donor or remaining on the list without a transplant. The reduction in mortality is greatest in centers with increased experience with LDLT. A learning curve has been described, with increased graft and patient loss during the first 15 to 20 cases at a single institution. When considering appropriate candidates, one must take into account the unique components of LDLT. Compared with DDLT, LDLT recipients have a slightly higher complication rate and receive a smaller graft, which may not be acceptable for decompensated patients with more advanced liver disease.
In the setting of acute liver failure (ALF), recipients typically have sufficiently high MELD scores that deceased-donor livers are available for transplantation, because these recipients are of highest allocation priority (see Chapter 107 ). Additionally, potential living donors may be unable to complete a thorough evaluation and careful consideration of donation in the short interval available. As a result, LDLT is rarely performed for ALF in the United States, to avoid unnecessary risks to a donor. Nonetheless, the A2ALL group demonstrated that LDLT in this setting is an acceptable option, with survival comparable to DDLT at 5-year follow-up if criteria for donation are met. Among recipients of LDLT in the setting of ALF, there were no observed deaths while waiting for transplantation.
Recipient complications and re-transplantation rates are higher in the setting of LDLT compared with DDLT. LDLT recipients have a higher rate of technical complications, including biliary leaks, biliary strictures, HAT, and PVT. In contrast, complications related to graft issues or ischemia reperfusion are more common in DDLT recipients. Ascites, pulmonary edema, intra-abdominal bleeding, and cardiac complications occur more frequently in DDLT. Once a complication occurs, time to resolution is comparable between LDLT and DDLT (see Chapter 111 ).
With regard to long-term outcomes, the A2ALL consortium showed that 10-year survival was 70% for LDLT and 64% for DDLT, based on 1427 liver recipients (964 of whom underwent LDLT). For all recipients, a diagnosis of primary sclerosing cholangitis (PSC) was associated with improved survival, whereas dialysis and older recipient/donor age were associated with worse survival. A higher MELD score was also associated with increased graft failure. Recipient diagnosis, age, disease severity, and presence of renal failure should therefore all be taken into account when considering appropriateness for LDLT. Overall, LDLT has been shown to improve survival in patients with lower MELD scores, decrease waitlist mortality, and provide good long-term outcomes.
Living donation is unique in that an otherwise healthy donor undergoes surgery; therefore minimizing donor morbidity and mortality is critical (see Chapter 121 ). Donor evaluation is comprehensive and includes a donor advocate, history and physical examination, cardiac clearance, psychosocial evaluation, and delineation of donor anatomy. Details of this complete evaluation are published elsewhere. During the donor evaluation, graft size is an important consideration. Small-for-size syndrome, or early allograft dysfunction, is characterized by the presence of jaundice or coagulopathy without technical complications. It is most frequently associated with grafts with a graft weight–to–recipient body weight ratio less than 0.8% or less than 40% of standard liver volume, and is reported to occur in up to 19% of LDLT grafts. Risk factors include use of left lobe grafts, small graft size, high preoperative bilirubin, high portal perfusion pressure, increased donor age, and increased donor BMI. Recipients who develop small-for-size syndrome are at increased risk for graft loss within 90 days.
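The graft-size threshold above is simple arithmetic. A minimal sketch, with the 0.8% graft-to-recipient weight ratio cutoff taken from the text and all names purely illustrative:

```python
def grwr_percent(graft_weight_g, recipient_weight_kg):
    """Graft-to-recipient weight ratio, expressed as a percentage."""
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

def at_risk_small_for_size(graft_weight_g, recipient_weight_kg):
    # Grafts below 0.8% GRWR are most frequently associated with
    # small-for-size syndrome (threshold from the text above).
    return grwr_percent(graft_weight_g, recipient_weight_kg) < 0.8

# Example: a 600 g graft in a 90 kg recipient gives a GRWR of about
# 0.67%, below the 0.8% threshold.
```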
Meticulous surgical technique and completion of an appropriate donor evaluation are critical to ensure donor safety and minimize morbidity and mortality. Nonetheless, up to 40% of living liver donors experience a complication. A cohort of 760 living donors enrolled in the A2ALL consortium provides a comprehensive assessment of donor complications. Commonly reported complications include bacterial infection (12.5%), biliary infection (9.7%), incisional hernia (5.6%), pleural effusion (5.3%), psychological distress (4.1%), and aborted hepatectomy (2.7%). Although donor mortality is low (a reported rate of 0.4%), it is not zero. These results should be provided to potential donor candidates as part of their comprehensive evaluation and informed consent process. As the role of LDLT continues to evolve, continued efforts to optimize donor and recipient selection and operative technique will be critical for program and patient success.
Advancement of organ preservation techniques is an ongoing area of research to expand options for organ donation. Static cold storage (SCS) at 4°C is the standard method of preservation of liver allografts. Although metabolic activity is reduced, anaerobic metabolism continues, leading to adenosine triphosphate (ATP) depletion with accumulation of lactic acid, cell swelling, and cell death. If prolonged, SCS can exacerbate ischemia-reperfusion injury and intensify complications such as primary nonfunction, early allograft dysfunction, biliary strictures, and ischemic cholangiopathy. Alternative methods of preservation have been successful in kidney transplantation, with improved graft and patient outcomes. In 2009, Guarrera et al. reported the first human trial using machine perfusion in liver allografts. Machine perfusion allows for an influx of nutrients and oxygen while flushing pro-inflammatory cytokines, reducing ATP depletion, and mitigating the effects of the ischemia-reperfusion injury associated with SCS. Placement of a liver on machine perfusion and subsequent monitoring can be seen in Figures 109.4 and 109.5 . Initial reports demonstrated lower serum injury markers and mean hospital LOS, fewer biliary complications, and reduced rates of graft dysfunction.
Since the initial report demonstrating safety and feasibility, others have sought to use machine perfusion with marginal or extended-criteria donor livers. In these early studies, machine perfusion has shown protection against early and late allograft injury in DCD donors, fewer biliary complications, and reduced hospital LOS. Machine perfusion allows for assessment of allograft viability using markers of bile production, biliary chemistry, and lactic acid levels. Aspartate transaminase (AST) and alanine aminotransferase (ALT) are cytoplasmic enzymes used as markers of hepatocyte injury present in the effluent perfusate; notably, early studies have demonstrated a reduction of AST and ALT, suggesting improved graft function. Hepatic arterial and PV pressure and resistance measurements may provide early measures of graft viability and function as well.
In a randomized trial of 220 LTs, machine perfusion was associated with a 50% lower level of graft injury. Importantly, these results were achieved with improved organ utilization (e.g., lower discard rates) and longer preservation time when compared with SCS. With growing use of marginal donors, the increased prevalence of steatotic livers poses a challenge to transplantation. There is a need to identify methods to optimize these donors so that steatotic organs may be suitable for transplantation. In 2017, of the 4346 steatotic livers recovered, 50.6% were discarded. There is some suggestion that machine perfusion with selected perfusates may be able to enhance liver function of steatotic livers, increase intracellular lipid metabolism, and reduce macrovesicular steatosis. Continued advancements in machine perfusion may help reduce waiting list mortality and continue to improve patient and graft outcomes.
The first published description of human LT was by Starzl and colleagues in 1963 at the University of Colorado (see Chapter 125 ). In this seminal paper, the dismal outcomes of three LT recipients were described, including one intraoperative death from uncorrectable coagulopathy and two survivors at 7 and 22 days. In addition to the pioneering conceptual framework and implementation of LT, the advanced techniques included grafts from non–heart-beating donors, venovenous bypass in the recipients, choledochocholedochostomy, and coagulation monitoring via thromboelastography (TEG). Many of these concepts remain or have re-entered the realm of LT more than 50 years after their initial description. Details of the operative procedure, including DDLT and living-related LT, are discussed in Chapters 121 , 125 , and 128 .