Transplant, solid organ


Abstract

Background

The development of transplantation constituted a significant advance in the management of solid organ failure and progressed rapidly over the second half of the 20th century. The achievement of improved clinical and patient-centered outcomes across the different disciplines has been a testament to dedicated clinicians and scientists worldwide. Changes in immunosuppression have improved outcomes but have also brought complications. Improving the longer-term outcomes of transplant recipients remains a significant hurdle throughout all solid organ transplantation, and interventions at each point in the transplant management process are being investigated.

Content

This chapter describes the background to solid organ transplantation and the critical immunologic steps for achieving good outcomes at the time of transplant and during ongoing management. The different immunosuppressive agents currently in use are described, with their associated benefits and side effects, together with the outcomes of different organ transplants. The complications associated with transplantation, monitoring of organ function, future directions, and nontransplant management of organ failure are also discussed.

Introduction

Organ transplantation had long been an experimental goal of physicians and scientists, with many small advances reported until the first successful kidney transplant took place in 1954 in Boston, MA. It was successful because the recipient and donor were identical twins. Previous attempts at renal transplantation, although surgically successful, had failed within the first fortnight due to graft rejection.

The current main barrier to organ transplantation is the lack of adequate organs available for potential recipients. The development of donation after cardiac death (DCD) criteria has provided an additional source of organs for transplantation in the United States. The application of DCD is growing, but there are still potential developments to improve outcomes. While the donation after brain death (DBD) kidney donation rate is static, a previous increase in living kidney donation initially improved transplantation rates; more recently, it is the increase in DCD across all solid organ transplantation that has begun to bridge the gap between supply and demand. Living donor kidney transplantation is beneficial in many ways: it significantly improves allograft survival, allows timing of the transplant while the donor kidney is in optimal health, avoids brain stem death and perioperative trauma, and offers shorter ischemic times.

Liver transplantation, developed like kidney transplantation through extensive canine experimental models, was first attempted in humans by Dr. Starzl in 1963. One-year post-transplant survival was achieved in 1967, again by Dr. Starzl, but survival at 1 year remained around 25% until the introduction of cyclosporine in the 1980s, when liver transplantation became a recognized treatment modality for liver failure within selection criteria.

Heart transplantation was first performed successfully in South Africa in 1967 by Dr. Barnard, following significant experimentation by Demikhov, who performed canine intrathoracic auxiliary heart transplants in Russia in the 1950s. In 1964, Dr. Hardy performed the first cardiac transplantation into a human, a xenotransplant (crossing species) using a large chimpanzee heart as the donor organ; this was not successful, and the recipient died 1 hour later. The first human lung transplant, in 1963, was an early technical success, but the recipient died within 3 weeks, and early clinical experience was marked by inadequate immunosuppression as well as problems with the bronchial anastomosis. The latter complication was associated with higher doses of steroids, and with the later introduction of cyclosporine, the complication rate fell and clinical success increased. Lung transplantation spread into wider clinical practice in the early 1980s; however, both recipient and donor matching remain significant factors in its success. The Toronto group pioneered the progression from single to bilateral lung transplantation in 1987 and 1988, respectively, and, more recently, ex vivo perfusion.

Currently, transplantation across the different organs remains dependent on organ availability, with more than 110,000 patients on the organ transplant waiting list in the United States in 2020. Most candidates waiting for an organ transplant are waiting for a kidney. In the United States in 2019, there were 23,401 kidney transplants, 8896 liver transplants, 3552 heart transplants, 2714 lung transplants, 872 combined pancreas and kidney transplants, and 143 pancreas-alone transplants. Despite significant success over the past 70 years in clinical transplantation, there are still significant advances to be made. While technical abilities have improved, the lack of available donors, the impact of immunosuppressive drugs on patients’ lives in terms of side effects and quality of life, the prevalence of chronic rejection as a leading cause of allograft failure, and the increasing need for re-transplantation are still problems to be solved by the transplant community.

Immunobiology of allotransplantation and barriers

The immunologic success of the first kidney transplant, owing to the human leukocyte antigen (HLA)-identical nature of donor and recipient, demonstrates the importance of both the donor’s and the recipient’s immune systems in transplantation success. The recipient died from cardiac complications 8 years later, demonstrating the impact of organ failure on systemic health. Assessment of compatibility in the early days of transplantation was based on confirmation of identical twin status, including blood group typing, fingerprint analysis, and skin grafting from donor to recipient.

Antigen–endothelial interaction

The interaction between the transplanted allograft and the recipient immune system remains the ongoing challenge in clinical transplantation. The immediate failure of nonhuman organs transplanted into humans demonstrated the immunologic risk of this procedure. HLA, Gal, and other antigens were not well understood, and our knowledge continues to expand beyond these antigens as methods for identifying circulating antibodies and immunogenic antigens improve. Indeed, Dr. Starzl's description of the major donor–recipient incompatibilities in the ABO blood group system clearly represents the identification of circulating factors without the identification or measurement of these agglutinins. Similarly, the ABO-incompatible transplant performed by Dr. Hume and colleagues (B into O) showed no immediate function and was infarcted on removal. While the ABO system was the easiest to identify and avoid, the evolution from requiring identical blood groups to accepting compatible phenotypes demonstrates the progress in understanding over the last 50 years. Blood group ABO became an absolute criterion for matching donors and recipients, and where systems failed, there were clinical consequences, usually identified immediately as hyperacute allograft loss.

Major histocompatibility complex (MHC) class I–related chain A (MICA) antibodies were first shown by Zou et al. to have a detrimental effect on kidney allografts. A subsequent study showed that recipients with MICA antibodies did not have inferior outcomes compared with MICA-negative recipients, supporting data from Terasaki et al. Anti-endothelial cell antibodies (AECA) found in one study were predominantly IgM rather than IgG (66% versus 14%). Other non-HLA antibodies measured include those against vimentin and glomerular basement membrane protein. The presence of anti-idiotypic antibodies, directed against the idiotype of an HLA antibody, has been associated with a reduction in chronic rejection despite the presence of donor alloantibody, although their role is still controversial. More recently, angiotensin II type 1 receptor antibodies have been measured and associated with worse outcomes; larger studies are needed to determine their clinical effectiveness for monitoring and the longer-term outcomes. Anti-myosin and anti-vimentin antibodies have been associated with the development of chronic cardiac allograft vasculopathy after heart transplantation, a pathologic process associated with alloimmunity and chronic antibody-mediated rejection; independent studies associating anti-myosin antibodies with cardiac allograft vasculopathy support longitudinal testing to assess the immunologic risk of the recipient. The non-HLA antibodies described above are not widely used or validated for routine clinical practice. Further development of assays to measure the interaction of individual immune systems with allografts, as described by Gates et al., would allow specific monitoring without the need for biopsies.

Anti-Gal antibodies, the immunologic barrier to xenotransplantation, are produced by natural antibody-producing B cells. Anti-Gal antibodies are about 1% of circulating IgG and 4% of IgM. Large amounts of IgM are deposited in xenografts in rejection. The presence of anti-Gal IgM and IgG can increase with exposure to antigen. In the context of allotransplantation, not xenotransplantation, anti-Gal antibody does not affect allograft outcomes; however, there is a suggestion that anti-Gal binding may overlap with that of other blood group antibodies (A or B), as the epitopes have similar structures, though the clinical relevance of this has not been demonstrated.

The discovery and development of HLA identification improved outcomes by reducing immediate allograft failure (see Chapter 97 for additional discussion of transplant compatibility testing). More recently, the application of anti-HLA antibody-specific measurement in the form of single-antigen beads, which can identify donor-specific antibodies (DSA), together with improvements in HLA typing, has helped to reduce the early immunologic risks of transplantation. Identifying risk may increase waiting time, but it reduces positive crossmatches and likely improves the successful utilization of organs with a reduced risk of rejection. Previously, the complement-dependent cytotoxicity (CDC) crossmatch was deemed the standard for allowing a transplant to proceed, but more sensitive measurement in “negative” CDC crossmatches has demonstrated significant donor-specific antibody burden and subsequent allograft failure. Anti-HLA class I and class II antibodies against donor antigens are associated with shorter allograft survival, with class II DSA more strongly correlated with kidney transplant failure than class I DSA.

In lung transplantation, in addition to anti-HLA, vimentin, and MICA antibodies, antibodies against collagen V, perlecan, Kα1-tubulin, and angiotensin II type 1 receptor (AT1R) have been implicated in antibody-mediated rejection (AMR). Liver transplants do not have the same susceptibility to alloimmune responses as other solid organs. While they are still susceptible to injury—for example, biliary structural damage in ABO-incompatible liver transplantation—this relative protection is due more to immunologic privilege than to organ antigen expression.

Blood group incompatibility and positive crossmatch transplants

Blood group incompatibility was previously thought to be an absolute contraindication for transplantation (see Chapter 90, “Blood group systems and pretransfusion compatibility testing,” for further discussion). However, inadvertent breaches of the ABO barrier (through clinical errors) took place, and successful outcomes were achieved using therapies to reduce blood group antibodies, allowing the transplant to continue to function. For example, Slapak et al. reported intravascular coagulation in an “A” kidney transplanted into an “O” recipient; using plasma exchange to rapidly reduce both IgG and IgM, the kidney allograft recovered to normal function, challenging the practice of treating blood group incompatibility as an absolute barrier in all circumstances. Subsequently, Alexandre et al. published a series of 26 ABO-incompatible kidney transplants with relatively good outcomes. Protocols and techniques were thus developed, mainly in living donor kidney transplantation, to allow kidney donation across the ABO barrier. These techniques, termed “desensitization” protocols, have largely revolved around three broad principles: removal of circulating antibodies (plasmapheresis or immunoadsorption), depletion of B lymphocytes using anti-CD20 antibodies, and immunomodulation (using intravenous immunoglobulin). The largest series have been published in Japan, where the practice is most well established.

Frequently, the desensitization protocol begins around one month before the transplant and involves a combination of rituximab, plasmapheresis, and intravenous immunoglobulin to deplete B cells and reduce circulating antibody levels, with the aim of achieving an isoagglutinin antibody titer of ≤1:8 before transplantation. Following the transplant, follow-up is similar to ABO-compatible transplants, with the additional need to follow isoagglutinin antibody titers more systematically. In ABO-incompatible (ABOi) kidney transplantation, clinical management is directed by the ability to monitor patients with serologic tests following transplantation, which often demonstrate low levels of anti-A/B specific antibodies. However, follow-up of a Japanese ABOi kidney transplantation cohort has demonstrated that, over a longer period, A/B antigen expression on biopsies of the donor kidneys decreases and a profile similar to the recipient blood group is seen.
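
Because the threshold above is expressed as a doubling-dilution titer, a small worked example may help. Below is a minimal sketch, in Python, of reading a dilution series against the ≤1:8 criterion; the tube readings and function name are hypothetical illustrations, not a validated laboratory method.

```python
# Illustrative sketch only: interpreting an isoagglutinin doubling-dilution
# series against the pre-transplant threshold described above. The readings
# and helper names are hypothetical, not a validated laboratory procedure.

def isoagglutinin_titer(agglutination_by_dilution):
    """Return the titer: the reciprocal of the highest dilution that still
    shows agglutination, given {dilution_factor: agglutinated?} results."""
    positive = [d for d, agglutinated in agglutination_by_dilution.items() if agglutinated]
    return max(positive) if positive else 0

# Doubling dilutions 1:1 ... 1:64; True = visible agglutination in that tube.
series = {1: True, 2: True, 4: True, 8: True, 16: False, 32: False, 64: False}

titer = isoagglutinin_titer(series)               # -> 8, i.e., a titer of 1:8
print(f"Titer 1:{titer}; proceed: {titer <= 8}")  # threshold <=1:8 per protocol
```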

The need to consider ABOi living donor kidney transplants has diminished with the increased practice of living donor kidney paired exchange. There remains an opportunity for deceased donor ABOi transplants, mainly blood group “A” kidneys into blood group “B” recipients with low titers, aimed at addressing the longer waiting times for deceased donor kidneys among recipients with blood groups “O” and “B” compared to blood group “A.” Using low antigen-density “A2” kidneys across blood group barriers in recipients with low anti-A titers carries relatively low immunologic risk compared to “A1” donors. Analysis of the UNOS database for “A2” donor kidneys showed that a significantly higher proportion of nonwhite recipients benefited when “A2” kidneys were used across the A-to-B blood group barrier, with similar clinical outcomes for patient and allograft survival. This approach has also been implemented successfully in the United States by measuring recipients’ ABO titers quarterly, with 25% more “B” recipients receiving transplants. Modeling of the kidney transplant waiting list in the UK demonstrated that using ABOi deceased donors could improve HLA matching and positively impact longer-term outcomes. Implementing ABOi deceased donor kidney transplantation when anti-A/B titers against the donor are high is difficult, given organ retrieval times and the time needed for antibody removal, so the approach has not been widely extended to other solid organ transplants.

Anti-HLA antibody has been recognized as a barrier to transplantation since Patel and Terasaki reported in 1969 that positive CDC crossmatches, found particularly in parous females, were associated with transplant failure. Determining the importance of crossmatching prior to transplantation in order to avoid very early failure significantly improved transplant outcomes. As described earlier, the sensitivity of anti-HLA testing has improved, with flow cytometric crossmatches now performed for sensitized kidney transplant recipients; virtual crossmatches, which utilize the better-defined HLA testing of donor and recipient, are becoming more common. The potential impact of anti-donor antibody can be measured by the different serologic testing methods for donor and recipient matching, with increased reactivity increasing the risk of rejection and poor transplant outcomes.
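
To illustrate the logic behind a virtual crossmatch described above, the sketch below compares a recipient's antibody specificities against a donor's HLA typing. The antigen labels, MFI values, and cutoff are hypothetical assumptions; real programs use allele-level typing and center-defined single-antigen bead thresholds.

```python
# Minimal sketch of virtual crossmatch logic: flag donor antigens against
# which the recipient has antibody above a threshold. All values below are
# hypothetical illustrations, not a clinical algorithm.

donor_hla = {"A2", "A24", "B7", "B44", "DR15", "DR4"}          # hypothetical donor typing
recipient_antibodies = {"A24": 9500, "B8": 4200, "DR11": 1200}  # specificity -> MFI (hypothetical)

MFI_CUTOFF = 2000  # assumed center-specific threshold for "unacceptable"

# Antibody specificities strong enough to count as unacceptable antigens.
unacceptable = {ag for ag, mfi in recipient_antibodies.items() if mfi >= MFI_CUTOFF}
dsa = unacceptable & donor_hla  # predicted donor-specific antibodies

print("Virtual crossmatch:", "POSITIVE" if dsa else "negative", "| DSA:", dsa or "none")
```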

The development of anti-donor antibodies occurs through germinal center activation of B cells by activated CD4+ T lymphocytes. CD40/CD40L interaction triggers cytokine release from the effector T lymphocyte, causing proliferation and class switching of the B lymphocytes. Repeated antigen exposure leads to antibody class switching, with the DNA sequence encoding the constant regions for each class arranged in order. The human sequence of class switching after IgM and IgD is as follows: IgG3, IgG1, IgA1, IgG2, IgG4, IgE, and IgA2. However, different isotypes can occur at the same time during maturation and proliferation of the B cell. Importantly, the constant region of the antibody determines its immunologic function; the Fc region is recognized by Fc receptors, which initiate a response in specific effector cells (such as macrophages and neutrophils) and bind complement. Antibody production is either short-lived (plasmablasts) or long-lived (plasma cells); the latter find their niche in the bone marrow and survive through anti-apoptotic mechanisms, including stimulation through CXCR4, APRIL, and IL-6. Determining which pathway predominates is important because long-lived plasma cells continue to produce antibodies, whereas short-lived plasmablasts have a finite life span, and if there is no further stimulation of B cells, antibody production will halt. Understanding this mechanism is important (1) to determine the longevity of the cells that play a crucial role in persistent alloantibody production and (2) to guide appropriate therapeutic interventions. While IgG is the predominant affinity-matured isotype, the most commonly measured, and the one whose presence has been associated with acute antibody-mediated rejection (AMR), other isotypes, such as IgM, have also been associated with AMR. The presence of anti-donor IgM has been described by a positive result in a CDC or flow crossmatch (FCX) that becomes negative on dithiothreitol (DTT) or heat treatment. Arnold et al. showed a low prevalence (17%) of IgM anti-HLA antibodies in patients waiting for transplantation, with IgM antibodies directed more against class I than class II HLA specificities.

Recently, DSA IgG has been eluted from transplant biopsies and correlates well with the specificity of circulating DSA IgG in serum, with complement activation fragment C4d deposition in the tissue, and with features of antibody-mediated injury. The presence of anti-HLA IgG has also been shown to be deleterious to the graft regardless of whether it is donor-specific or non–donor-specific, suggesting the involvement of immune mechanisms not currently measured. Natural killer (NK) cells have been associated with AMR in the absence of anti-HLA antibodies.

Currently, different programs approach ABO incompatibility in different ways. In living donor kidney transplantation, ABOi tends to be avoided through paired exchange programs, in which the incompatible pair takes part in either a direct two-way exchange or a more complex exchange program ( Fig. 95.1 ).

FIGURE 95.1, Avoiding incompatible kidney donor-recipient pairs is possible through kidney exchange programs, which can involve either direct two-way exchanges (A) or larger numbers of pairs (B) to allow successful compatible transplants.
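
As a schematic of the two-way exchange in Fig. 95.1A, the sketch below pairs incompatible donor-recipient couples whose donors suit each other's recipients. The pair data and the blood-group-only compatibility rule are hypothetical simplifications; real exchange programs also screen HLA antibodies and optimize longer chains.

```python
# Illustrative sketch of finding direct two-way kidney exchanges among
# incompatible donor-recipient pairs (cf. Fig. 95.1A). Pairs and the
# compatibility rule are hypothetical, not a registry algorithm.

from itertools import combinations

ABO_COMPATIBLE = {  # donor group -> recipient groups that can accept it
    "O": {"O", "A", "B", "AB"}, "A": {"A", "AB"},
    "B": {"B", "AB"}, "AB": {"AB"},
}

# Each incompatible pair: (its donor's blood group, its recipient's blood group)
pairs = {"pair1": ("A", "B"), "pair2": ("B", "A"), "pair3": ("AB", "O")}

def donor_fits(donor_group, recipient_group):
    return recipient_group in ABO_COMPATIBLE[donor_group]

# A two-way exchange works when each pair's donor suits the other's recipient.
for (name1, (d1, r1)), (name2, (d2, r2)) in combinations(pairs.items(), 2):
    if donor_fits(d1, r2) and donor_fits(d2, r1):
        print(f"Two-way exchange possible: {name1} <-> {name2}")
# -> pair1 <-> pair2 (the A donor goes to the A recipient, B donor to B recipient)
```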

Living donor kidney transplantation (with a planned transplantation date) from a donor against whom the recipient has antibodies can be prepared for in advance to reduce the immunologic risk at the time of transplant. Different therapeutic options have been used to lower the immunologic risk, including plasmapheresis, intravenous immunoglobulin (IVIg), anti-B cell therapies, and anti-plasma cell therapies. While these have successfully allowed transplantation, the clinical outcomes have been less favorable than for compatible donor-recipient pairs. As with ABOi pairs, and to improve outcomes, kidney paired exchange programs have aimed to avoid donors against whom there are DSA.

Despite the lower allograft survival in these cohorts, there is still an ethical and quality-of-life component to these discussions. The probability of matching patients who are highly sensitized against HLA antigens through prior transplantation, pregnancy, or blood transfusion is very low. In both paired exchange programs and the deceased donor pool, the rate of transplantation for highly sensitized kidney transplant candidates (calculated panel reactive antibody [CPRA] > 99.96%) can be as low as 7%, compared to around 25% when the CPRA is less than 99.5%. Thus patient experience and quality of life, degree of risk, and patient survival all need to contribute to discussions regarding the optimal choice between dialysis and transplantation. In the United States, patients receiving a positive-crossmatch kidney transplant demonstrated increased survival compared to a matched cohort who waited for a transplant, whether those patients subsequently received a transplant or continued on dialysis. In an equivalent UK study, patient survival was not inferior to that of patients who were transplant-listed and subsequently received a deceased donor kidney.
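
Conceptually, the CPRA figures above represent the share of donors carrying at least one antigen unacceptable to the candidate. A toy sketch of that calculation follows; the four-donor pool and antigen lists are hypothetical, whereas registries derive CPRA from population HLA haplotype frequencies rather than a raw donor list.

```python
# Rough sketch of the idea behind a calculated panel reactive antibody (CPRA):
# the percentage of a reference donor pool carrying at least one antigen the
# candidate has antibodies against. Pool and antigens are hypothetical.

donor_pool = [  # hypothetical donors' HLA antigens
    {"A1", "A2", "B8", "DR3"},
    {"A2", "A24", "B7", "DR15"},
    {"A3", "A11", "B35", "DR1"},
    {"A1", "A3", "B7", "DR15"},
]
unacceptable = {"A2", "B7"}  # candidate's unacceptable antigens (hypothetical)

# Count donors sharing any unacceptable antigen (non-empty set intersection).
incompatible = sum(1 for donor in donor_pool if donor & unacceptable)
cpra = 100 * incompatible / len(donor_pool)
print(f"CPRA = {cpra:.1f}%")  # 3 of 4 donors carry A2 or B7 -> 75.0%
```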

Ischemia-reperfusion and organ retrieval

Deceased donor organ retrieval is a vital component of transplantation. Timing of organ retrieval is important and differs among solid organs, with heart and lungs taking priority over abdominal organs. Warm ischemic time, the time from declared death to organ procurement, is associated with increased injury, as hypoxic changes occur while the organ's metabolic demand persists. Cold ischemic time, the time during organ transportation, is better tolerated by abdominal organs. There are concerted efforts to improve the procurement, storage, and transportation of donor organs to reduce the ischemic impact on organs and organ function. The initial development to improve organ preservation and reduce hypoxic injury was static cold storage, using a preservation fluid with ice cooling for transport and storage. This technique has progressed to machine perfusion of organs, which allows clinicians to monitor fluid flow rates and vessel pressures to assess organ viability and potentially to adjust parameters before transplantation. Organ storage temperatures can be divided into categories: hypothermic (1 to 8 °C), subnormothermic (20 to 35.5 °C), and normothermic (35.5 to 37.5 °C). More recently, major endeavors have investigated different perfusion models and the development of normothermic techniques. Organ preservation and machine perfusion have recently been reviewed by Xu and colleagues but are briefly highlighted below. Importantly, different organs have different thresholds for storage, with thoracic organs requiring much shorter times than abdominal organs.
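
For reference, the temperature bands above can be encoded as a simple lookup; this is only a convenience sketch, and the handling of the shared 35.5 °C boundary is an assumption, since the source gives ranges rather than edge rules.

```python
# Convenience sketch mapping a perfusion temperature (in degrees Celsius) to
# the storage categories defined above. Boundary handling at exactly 35.5 C
# is an assumption; the source lists ranges only.

def perfusion_category(temp_c: float) -> str:
    if 1 <= temp_c <= 8:
        return "hypothermic"
    if 20 <= temp_c <= 35.5:
        return "subnormothermic"
    if 35.5 < temp_c <= 37.5:
        return "normothermic"
    return "outside defined ranges"

print(perfusion_category(4.0))   # hypothermic (static cold storage range)
print(perfusion_category(37.0))  # normothermic
```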

Kidney perfusion models have been developed to examine ex vivo perfusion as a predictor of improved allograft function post-transplant. Randomized controlled studies are currently underway to assess the impact of using higher-risk kidneys with either standard cold fluid storage or normothermic perfusion, assessing renal recovery post-transplantation; this technique was developed to improve organ performance in kidneys initially declined for transplantation. Interventions adjusting the oxygenation of the perfusate in a normothermic system have affected the physiologic function of the organ in experimental models.

Liver normothermic perfusion models are being developed to improve clinical outcomes but are not yet available in clinical practice in the United States. The goals are to (1) reduce ischemia-related complications in the recipient and (2) improve the organ procurement rate by reducing the discard rate due to poor organ quality. This is achieved by delivering oxygenated blood, medications, and nutrients at normal body temperature in an attempt to establish a normal physiologic environment. The complications related to organ quality are livers that never function, known as primary nonfunction, and early structural complications such as biliary strictures. A pan-European study demonstrated significant reductions in early allograft dysfunction and peak liver enzymes (markers of long-term graft and patient survival) compared to cold storage transportation. Kidneys deemed nontransplantable have likewise been restored using normothermic perfusion with acceptable clinical outcomes, a potential answer to high discard rates. Technology to reduce extracorporeal injury to allografts is a key target for improving organ procurement and may also allow improvement in organ quality, or at least establish organ viability, prior to transplantation. Given the increasing demand for organs, better mechanisms of allograft assessment, therapeutics, and transportation would increase access to deceased donor organs. Perfusion machinery could be used to modify the endothelial surface, induce immunosuppression locally, or enable treatment with metabolic factors to reduce the impact of procurement.

Lung perfusion machines, pioneered in Toronto, Canada, are still being established in clinical application. The parameters that determine whether lungs can be used after ex vivo perfusion are yet to be defined and are largely driven by clinical acumen rather than a balanced weighting of specific measurements. Lung compliance and oxygenation are two of the most relevant variables used to define quality; however, both are highly dependent on the amount of lung parenchyma recruited. As with other organs, normothermic and hypothermic perfusion of cardiac allografts has been tested in both animal models and discarded cardiac allografts, and successful normothermic cardiac transplantation has been performed. The benefit of normothermic perfusion is the removal of toxins from the perfusate prior to transplantation, reducing ischemia-reperfusion injury and cardiovascular instability. The development of perfusion machinery that not only improves the transport of retrieved organs but also allows assessment and potential intervention continues to drive research. Improvements in machine perfusion could lead to increased use of DCD cardiac donation.

Immunosuppression in transplantation—the past, present, and future

Suppression of the immune system was the missing link in transplantation; modulating the immune system toward tolerance of the allograft was the main goal of early studies, and progress was incremental. The multiple pathways in T-cell development are the targets of the immunosuppressive regimens ( Fig. 95.2 ) that have made the most impact on allograft outcomes.

FIGURE 95.2, Mechanistic pathways (see Fig. 95.15 ) in T-cell activation and targets for immunosuppressive therapies. Antigen-presenting cells (APCs) present donor antigen to naïve and central memory T cells, which are activated by three major signaling pathways leading to clonal expansion and differentiation to express effector functions. Antigen triggers T-cell receptors (TCRs) (Signal 1) and synapse formation. CD80 (B7-1) and CD86 (B7-2) on the APC engage CD28 on the T cell to provide Signal 2. These signals activate three signal-transduction pathways—the calcium–calcineurin pathway, the mitogen-activated protein (MAP) kinase pathway, and the protein kinase C–nuclear factor-κB (NF-κB) pathway—which activate the transcription factors nuclear factor of activated T cells (NFAT), activating protein 1 (AP-1), and NF-κB, respectively. The result is expression of CD154 (which further activates APCs), the interleukin-2 receptor α chain (CD25), and interleukin-2. Receptors for a number of cytokines (interleukins 2, 4, 7, 15, and 21) share the common γ chain, which binds Janus kinase 3 (JAK3). Interleukin-2 and interleukin-15 deliver growth signals (Signal 3) through the phosphoinositide 3-kinase (PI-3K) pathway and the molecular target of rapamycin (mTOR) pathway, which initiates the cell cycle. Replication of lymphocytes requires synthesis of purine and pyrimidine nucleotides, regulated by inosine monophosphate dehydrogenase (IMPDH) and dihydroorotate dehydrogenase (DHODH), respectively. At various points in these pathways, targets are available for therapeutic intervention to prevent T-cell activation: MPA (mycophenolic acid); JAK3 inhibition (discontinued due to complications); anti-CD52 (alemtuzumab); anti-CD154 (discontinued due to increased thromboembolic events); anti-CD3 (OKT3); and CTLA-4-Ig (belatacept).

Steroids have been used since the inception of transplantation, usually at high doses for induction followed by maintenance dosing. Given their significant systemic side effects, most current protocols use steroids as part of an immunosuppression regimen, titrating down to either a low dose (5 mg) or none for chronic management. High-dose steroids are still used for induction because of the multiple pathways they affect, including inhibition of cytokine production, reduction of interferon gamma (IFN-γ) and tumor necrosis factor α (TNF-α) production, and redistribution of lymphocytes and macrophages to lymphoid tissue. Because of the chronic side effects, steroid-free immunosuppression regimens are offered, usually in the context of lower immunologic risk, whether from transplant immunology or from the immunology of the native disease (in autoimmune diseases, for example). Late withdrawal of steroids is associated with poor outcomes, rejection, and development of anti-HLA antibodies. Pascual et al. reported a risk ratio of 2.28 (95% CI, 1.65 to 3.16) for rejection with steroid withdrawal in the modern triple immunosuppression era of tacrolimus, mycophenolate, and steroids.

Calcineurin inhibition

Cyclosporine was introduced in the 1970s, and its use significantly increased 1-year kidney allograft survival; in one early report, there was only 1 graft failure among 32 kidney transplants. However, the risk of death in this cohort was high (6 of 32 died in the first year), predominantly from infection (n = 5) but also early lymphoma (n = 1); nevertheless, cyclosporine significantly improved the effectiveness of transplantation, meaning that organs lasted longer. These infections occurred at the higher cyclosporine doses used in early clinical practice (25 mg/kg/day); with the subsequent dosing of 17 mg/kg/day, patients did not experience the same complications. The delivery of cyclosporine was changed to improve tolerability, as the early non–water-soluble formulations were poorly tolerated by patients. Demonstration of a dose-response relationship through therapeutic drug monitoring led to more standardized dosing, with both 2-hour post-dose (C2) and 12-hour trough (C0) concentrations used to define therapeutic targets in organ transplantation. While significant benefit in clinical outcomes was demonstrated, commonly occurring side effects of cyclosporine were gingival hypertrophy, hirsutism, hypertriglyceridemia, hypertension, and kidney damage with interstitial fibrosis and tubular atrophy attributed to calcineurin inhibitor (CNI) use. For information on the measurement of cyclosporine, refer to Chapter 42 .
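
As an illustration of trough-guided dosing described above, the sketch below applies the simplest proportional heuristic sometimes used in therapeutic drug monitoring. The doses, measured concentration, and target are hypothetical values chosen for arithmetic clarity, and the linear dose-exposure assumption is a simplification; this is not clinical guidance.

```python
# Illustrative sketch only: proportional trough-based dose adjustment, the
# simplest therapeutic drug monitoring heuristic. All numbers below are
# hypothetical; real dosing uses drug- and center-specific target ranges.

def proportional_dose_adjustment(current_dose_mg, measured_trough, target_trough):
    """Assume roughly linear dose-exposure: scale dose by target/measured."""
    return current_dose_mg * target_trough / measured_trough

# e.g., a 12-hour trough (C0) of 350 ng/mL against an assumed 250 ng/mL target
new_dose = proportional_dose_adjustment(current_dose_mg=200,
                                        measured_trough=350,
                                        target_trough=250)
print(f"Suggested dose: {new_dose:.0f} mg")  # ~143 mg, rounded to practical units
```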

Therapeutic options remained similar until the 1990s, when tacrolimus, another calcineurin inhibitor, was introduced; this remained a twice-daily regimen with therapy directed by 12-hour trough concentrations. The introduction of tacrolimus improved kidney transplant survival at 1 year from 86.6% to 91.1% in the 1995 UNOS report, with a reduction in rejection compared to cyclosporine (OR 0.52 [0.36 to 0.75]). Similarly, in liver transplantation there were fewer rejections compared to cyclosporine (58.6% [154/263] versus 65.0% [173/266], P < .002), and steroid-resistant rejection was lower (27.9% [43/154] versus 47.4% [82/173], P < .001). In lung transplantation, a meta-analysis showed fewer rejection episodes with tacrolimus than with cyclosporine (mean difference −0.14 [−0.28 to −0.01], P = .04) but more new-onset diabetes in the tacrolimus groups (OR 3.69 [1.17 to 11.62], P = .03). Tacrolimus has been widely adopted by most transplant programs worldwide. With its use, the side-effect profile changed to an increased incidence of diabetes, tremor, and infections. In liver transplant recipients treated with tacrolimus, there was a twofold increase in malignancy on multivariate analysis compared to cyclosporine (hazard ratio 2.06). The predominance of tacrolimus-based regimens in the United States in 2018 (kidney 95.3%, heart 96.6%, lung 85.5%, liver 77.5%) demonstrates the central role of maintenance calcineurin inhibition in clinical transplant management.

mTOR inhibition

The mTOR (mammalian target of rapamycin) inhibitors were introduced to target an additional pathway and to avoid CNI-related complications. Initial clinical results demonstrated significant problems, including impaired wound healing, lymphocele development, proteinuria, hyperlipidemia, peripheral edema, pneumonitis, and stomatitis. mTOR inhibitors have been used in multiple solid organ transplants and as therapy for neuroendocrine tumors, breast cancer, renal cancers, and renal angiomyolipomas. Their mechanism of action is proliferation signal inhibition, preventing the interleukin-2 (IL-2)–initiated proliferation of T cells. The two drugs in this class, sirolimus and everolimus, share significant pharmacologic similarities, including high interpatient variability, a narrow therapeutic index, poor correlation of exposure with dose, and systemic exposure largely dependent on wide tissue distribution.

Despite lower concentrations, the second-generation mTOR inhibitor everolimus has been shown to be significantly more effective at inhibiting mTORC2, a class I–stimulated pathway that regulates the endothelial cell function associated with vasculopathy. Endothelium-dependent smooth muscle relaxation is more impaired with sirolimus than with everolimus. More recently, sirolimus, another drug in this group, has been used as an alternative to tacrolimus in heart transplant patients, with success in reducing the risk of vascular disease and the development of malignancy.

Side-effect profiles are difficult to compare, as there are few randomized controlled trials directly comparing the two drugs. Some studies report that everolimus has fewer side effects than sirolimus, while others report similar rates. Importantly, the incidence of poor surgical wound healing has been shown to be lower with everolimus than with sirolimus.

Early clinical trial data with sirolimus may be clouded by higher therapeutic targets than are currently used and by increased toxicity when combined with cyclosporine; these factors may limit their applicability to current practice. Nevertheless, sirolimus has been shown to be of great benefit in reducing tumor burden, particularly in the secondary prevention of skin cancers, with around a 50% reduction in squamous cell, basal cell, and other skin cancers ( P < .05); however, the side-effect burden was significant, specifically peripheral edema, skin and mouth lesions, new-onset proteinuria, and dyslipidemia ( Table 95.1 ), reflecting the difficulty of managing mTOR inhibitors in clinical practice. In addition, a meta-analysis demonstrated an increased mortality risk for those using sirolimus. For information on the measurement of the various immunosuppressant drugs, refer to Chapter 42 .

TABLE 95.1
Side Effects of Sirolimus in Patients Treated for Secondary Prevention of Skin Cancers (N = 64)
Side Effect Number Percentage
Edema 37 57.8%
Acne-like lesions 28 43.8%
Aphthous ulcers 24 37.5%
Proteinuria 20 31.3%
Diarrhea 17 26.6%
Dyslipidemia 15 23.4%
Pneumonitis 14 21.9%
Anemia 12 18.8%
Cough 10 15.6%
Arthralgia 10 15.6%
Worsening of hypertension 9 14.1%
Leukopenia 7 10.9%
Bronchitis 7 10.9%
Urinary tract infection 7 10.9%
Exercise dyspnea 6 9.4%
Unexplained fever 6 9.4%
Herpes simplex 5 7.8%
Rash 5 7.8%
Thrombocytopenia 5 7.8%
Drug discontinued due to side effects 15 23.4%
