Large-diameter corneal grafts are used for the treatment of uncontrolled corneal ulcers, keratomycoses, and other severe necrotizing corneal conditions that threaten vision or the eye itself.
Patients with severe ocular surface disease manifesting as conjunctivalization and superficial neovascularization, which may occur due to limbal stem cell deficiency, are at extremely high risk for immunologic and nonimmunologic graft failure. In these patients, the appropriate strategy is to first treat the surface disease and then proceed to keratoplasty.
Topical tacrolimus or cyclosporine should be considered routinely for all patients undergoing high-risk keratoplasty.
Systemic mycophenolate mofetil is an effective treatment in high-risk keratoplasty as a single agent for long-term maintenance therapy. Given its safety profile, it is preferred by these authors as a first-line systemic agent for high-risk keratoplasty.
Systemic cyclosporine or tacrolimus may be used as a second agent along with mycophenolate, or as a single agent particularly in atopic/vernal patients in preparation for surgery.
In high-risk patients who continue to experience corneal graft rejection on a single systemic agent, the Cincinnati Immunosuppression protocol may be considered to achieve adequate immunosuppression while minimizing the individual dosages of medications.
Novel immunomodulation strategies (e.g., those that lead to inhibition or reduction of hemangiogenesis and lymphangiogenesis) are promising options with a favorable safety profile that might replace/complement immunosuppression in high-risk corneal transplantation in the near future.
Corneal transplantation is the most common form of solid tissue transplantation. The high success rate of uncomplicated first grafts performed in avascular “low-risk” beds contrasts sharply with the results of corneal grafts placed in so-called “high-risk” beds, in which rejection rates can approach 70% even with maximal local immune suppression. Immune-mediated rejection remains the leading cause of corneal transplant failure. Although the emergence of lamellar keratoplasty has improved overall graft survival, a number of factors remain associated with a higher risk of rejection. This chapter reviews the immunology of graft rejection, discusses prognostic factors, and examines the clinical and experimental approaches for avoiding graft rejection in patients who are at increased risk of immunologic graft failure. In addition, large-diameter corneal grafts are discussed in more detail.
Two important facets have shaped our understanding of why keratoplasty succeeds in some cases and fails in others. First, the unique anatomy of the cornea is responsible for the limited induction of immunity (the afferent arc) and expression of an immune response (the efferent arc). Second is the degree of vascularization and the condition of the host bed, such as in high-risk keratoplasty.
The genes that encode the major transplantation antigens are located within the major histocompatibility complex (MHC) and, individually, are called human leukocyte antigens (HLAs). Major antigens induce a more vigorous immune response than minor antigens. Class I genes (HLA-A, B, and C) are expressed by all cells, while class II histocompatibility antigens (HLA-DP, DQ, and DR) are found on specific immunocompetent antigen-presenting cells (APCs) of the lymphoreticular system: dendritic cells (DCs), epithelial Langerhans cells (LCs), macrophages, and B-cells.
Minor transplantation antigens are encoded by genes outside the MHC at numerous loci. In the cornea, ABO blood group antigens are expressed by epithelial cells and are upregulated during graft rejection.
Several unique anatomic and physiologic features of the cornea and the anterior chamber contribute to the overall success of corneal transplantation. The lack of corneal vascularity limits access of the immune system. The absence of corneal lymphatics prevents high-volume delivery of antigens and APCs to T-cell reservoirs such as lymph nodes (LNs). The low expression of MHC antigens by the cornea limits the targets of the immune response. Ocular expression of a unique range of immunomodulatory factors and neuropeptides, including transforming growth factor (TGF)-β2 and α-melanocyte stimulating hormone (α-MSH), inhibits T-cell and complement activation. The expression of CD95 (Fas) ligand can induce apoptosis of stimulated Fas+ T-cells. Antigens placed in the anterior chamber can cause selective and adoptively transferable suppression of the systemic immune response, known as anterior chamber–associated immune deviation (ACAID).
The cornea was previously thought to be devoid of resident APCs, including LCs, and this absence was assumed to be a critical component of corneal immune privilege. Although the cornea is endowed with resident DCs that are universally MHC class II negative in the uninflamed state, these cells are capable of expressing class II antigen after inflammation or transplantation.
High-risk keratoplasty is characterized by a loss of corneal avascularity, the establishment of lymphatic drainage to cervical LNs, migration of LCs into the cornea, and maturation of resident epithelial LCs and stromal DCs, which then may function as APCs to enhance immune surveillance. Other features include upregulation of proinflammatory cytokines such as interleukin (IL)-1 and tumor necrosis factor (TNF)-α. Both IL-1 and TNF-α have been shown to suppress immunomodulating pathways including ACAID; to augment MHC expression and maturation of DCs; and to upregulate the expression of adhesion and chemotactic factors. There is overwhelming evidence that nearly every aspect of normal corneal and ocular physiology is lost in high-risk corneal graft recipients.
The process of corneal transplant rejection includes an induction phase, called the “afferent” arm, and an expression phase, called the “efferent” arm. In the afferent arm, the host becomes sensitized to the donor antigens by means of APCs (e.g., MHC class II-positive DCs, LCs, and macrophages) that present antigens to T-cells in draining LNs (Fig. 126.1). The allorecognition process involves two different pathways (Fig. 126.2). In the direct pathway, donor APCs sensitize the host directly when host T-cells recognize donor class II MHC, thereby generating direct alloreactive T-cells. In the indirect pathway, host APCs travel to the graft, take up donor antigens, migrate to draining LNs, and then present those antigens in the context of “self” class II MHC to the receptors of naive T-cells. There is accumulating evidence that host sensitization to donor antigens of corneal grafts occurs through both pathways, especially in high-risk corneal grafting. Furthermore, the critical role of draining cervical LNs in the process of allosensitization has been clearly demonstrated. Upon arrival in the LNs, APCs upregulate surface expression of costimulatory molecules (e.g., ICAM, B7, CD40) and also secrete cytokines, including IL-12, both of which are required for activation of T-cells.
The efferent phase is responsible for the “attack” on the graft. This phase consists of the proliferation of alloreactive T-cells in lymphoid organs, delivery of these cells to the cornea, and the development of “memory” that can amplify the allo-immune response upon repeated exposure to the same antigens. CD8+ cytotoxic T-lymphocytes (CTLs) and CD4+ T-helper (Th) cells are the primary immune effector mechanisms implicated in the rejection of solid tissue grafts. CD4+ Th1 cells are the primary mediators of the efferent arm and act directly as effector cells in corneal graft rejection. IL-2, secreted by these cells, stimulates the activation and proliferation of other T- and B-cells, whereas interferon (IFN)-γ activates macrophages and induces expression of class II antigens in the donor tissue. The role of CD8+ T-cells in the rejection of allogeneic corneal grafts remains under study. It has been demonstrated that corneal graft rejection may occur in a CD4+ T-cell–independent fashion, and that CD8+ T-cells can contribute to graft rejection, although corneal grafts can still be rejected in their absence. A subpopulation of T-cells, the so-called regulatory T-cells, has been shown to prolong corneal graft survival. These cells primarily suppress the induction of allo-immunity in regional draining LNs rather than suppressing the effector phase of the immune response in the periphery.
A major factor directing the recruitment of T-cells and other leukocytes into grafts is the production of chemoattractant cytokines, or chemokines. Chemokines function in concert with other molecular mediators, including integrins and other adhesion molecules, to direct the immune response toward the graft. Recipients of high-risk transplants express very high levels of the chemokine IP-10. Although donor-specific antibodies have been detected in host serum after corneal grafting, rejection can occur in the absence of antibodies, and the involvement of antibodies remains controversial.
Mast cells have recently been reported to regulate both innate and adaptive immune responses by promoting allosensitization. Increasing numbers of mast cells following transplantation suggest that they contribute to allograft rejection at the ocular surface.
Vascularized corneas have a much higher risk of graft rejection than avascular corneas (Fig. 126.3). The Collaborative Corneal Transplantation Studies (CCTS) defined “high-risk” as a cornea with two or more quadrants of deep stromal vascularization. In the CCTS, the risk of rejection was doubled in patients with stromal vascularization in all four quadrants.
Corneal stromal vascularization
Prior graft loss, especially from allograft rejection
Increased graft diameter and eccentric grafts
Anterior synechiae
Previous intraocular surgery
Herpes simplex and herpes zoster keratitis
History of anterior segment inflammatory disease
Ocular surface disease
Young age, especially infants and children
Glaucoma
Corneal hemangiogenesis and lymphangiogenesis facilitate immune cell migration to and from the graft, thereby promoting the development of rejection. The degree and depth of preoperative corneal vascularization determine the onset and severity of rejection. The average time between surgery and rejection has been shown to be approximately 10 months in avascular corneas, 4 months in mildly vascularized corneas, and 2 months in heavily vascularized corneas. Allograft reactions are more likely with deep stromal neovascularization than with superficial neovascularization, and deep vessels may be associated with leukocyte adhesion to apparently normal endothelial cells (Fig. 126.4). Once corneal rejection occurs, the likelihood of reversal also depends on the degree of corneal vascularization.
Mechanical ablation of vessels using methods such as fine needle diathermy, argon laser photoablation, and cryotherapy has been described, although the effects are temporary.
The Cornea Donor Study (CDS) has shown an increased risk of graft failure in those with a history of a definite graft rejection episode. The 10-year rejection rate among those with no rejection episodes was 12%, versus 22% with a history of at least one definite rejection episode. The patients in the CDS primarily had either Fuchs dystrophy or pseudophakic bullous keratopathy as the indication for transplantation. Recently, in a retrospective case review, pseudophakic corneal decompensation was reported as the most common indication for lamellar keratoplasty. In this study, Fuchs dystrophy and prior corneal transplant failure were the main risk factors for graft failure.
Prior corneal graft failure is a significant risk factor for future corneal transplant rejection, especially if the graft failure was the result of an allograft rejection. Rejection rates in patients with comparably vascularized recipient beds are approximately 40% after the first graft, 68% after the second graft, and 80% after the third graft. Graft rejection occurs earlier and follows a much more fulminant course in regrafts than in first grafts. In the CCTS, the number of previous grafts was a strong risk factor for graft failure, with the risk increasing with each additional graft.
There are several possible explanations for the observation that prior graft loss is a major risk factor. First, the residua of previous surgery such as corneal neovascularization and peripheral anterior synechiae may increase the chance that subsequent grafts will fail. Second, immune mediators may be localized more strategically following previous graft rejection, so that the stage is set for more efficient recognition and rejection of foreign tissue. Third, pre-sensitization may occur when antigens of donor and recipient are shared such that the afferent blockade is circumvented.
More recently, there have been studies suggesting that any corneal transplant, regardless of outcome, may potentially increase the risk of subsequent transplant rejection in either eye. This is based on experimental studies that have shown that severing of corneal nerves during corneal transplant surgery abolishes immune privilege of subsequent corneal transplants placed into either eye. This has been termed sympathetic loss of immune privilege (SLIP).
Increased corneal graft diameter has been reported to be a significant prognostic factor, presumably because larger grafts are closer to the limbal vasculature and contain more antigenic material. The correlation between graft diameter and subsequent failure may be strong in patients who have had a previous rejection if the surgeon increases the graft size to ensure that the opacified cornea is completely excised. One study reported a 46% 4-year survival probability in 17 grafts (16 eyes of 15 patients) measuring 10 mm or more in diameter, and another demonstrated a successful outcome in 11 of 15 eyes (13 patients).
Outcomes of large-diameter penetrating keratoplasty (graft size 8.75–10.0 mm) in 35 eyes of 32 patients have been reported. At last follow-up, 33 of 35 grafts remained clear. The postoperative regimen included a topical corticosteroid and topical cyclosporine 0.05%; the authors felt that the addition of the latter may have contributed to the relatively low rate of graft rejection and failure.
Two studies have shown that smaller grafts were actually at greater risk of rejection, while other studies showed that graft rejection was statistically independent of graft size. Large grafts are discussed in more detail later in this chapter.
Direct contact of the graft with the host vascular system through peripheral anterior synechiae is believed to increase the risk of graft failure in the presence of three or four quadrants of iris synechiae. In the CCTS, the failure rate from any cause doubled if the eye had three or four quadrants of anterior synechiae. Experimental models have further shown that anterior synechiae impair ocular immune privilege. The poor prognosis associated with anterior synechiae may also be related to nonimmunologic factors. For instance, eyes with anterior synechiae have an increased incidence of glaucoma, or the synechiae may cause traction on the corneal endothelium, both of which may lead to endothelial cell loss.
Performing an isolated descemetorhexis prior to endothelial keratoplasty has been suggested in one case as a means of preventing central iridocorneal synechiae.
Previous intraocular surgery was also associated with graft failure in the CCTS. Lensectomy, vitrectomy, and procedures to control intraocular pressure (IOP) were identified as risk factors. The residua of previous surgery such as anterior synechiae, corneal neovascularization, poorly controlled IOPs, and chronic inflammation may have contributed to graft failure. Concomitant vitrectomy at the time of keratoplasty has likewise been found to increase the risk of immunologic rejection. Previous glaucoma surgery is also a risk factor for graft failure, which has been demonstrated by the CDS.
Several studies have shown that herpetic keratitis predisposes an eye to immunologic graft failure. In a mouse model of herpetic keratitis, both syngeneic and allogeneic corneal grafts were rejected when placed in eyes with preexisting disease. Previous herpetic keratitis accelerates graft failure through both innate and adaptive immune mechanisms, predisposing the graft to rejection. In a retrospective chart review of 42 patients who underwent penetrating keratoplasty (PKP) for herpetic keratitis as high-risk corneal grafts, the recurrence and graft rejection rates differed between perforated and quiescent herpetic corneal scars, whereas the graft survival rate at 3 years was similar. A retrospective review of 87 herpes simplex cases treated with systemic acyclovir and immunosuppression with cyclosporine A (CsA) or mycophenolate mofetil showed a rate of graft survival comparable with normal-risk keratoplasty.
Favorable outcomes with penetrating keratoplasty in patients with herpes zoster ophthalmicus were reported. It was felt that appropriate patient selection and a longer quiescent period increased the likelihood of graft success. Overall, while the outcome of initial grafts for quiescent herpetic scars is reasonable, patients with active inflammation, ulceration, perforation, or extensive neovascularization have particularly poor success rates.
Inflammation at the time of penetrating keratoplasty is a significant risk factor for graft failure. Any inflammatory response in the anterior segment can facilitate the afferent and efferent arms of the allograft reaction by inducing the production of cytokines, upregulating the expression of HLA-DR in the graft, and facilitating the migration of lymphocytes through increased expression of adhesion molecules. Retrospective studies have demonstrated a less favorable outcome when keratoplasty is performed for acute problems such as corneal perforations. Whenever possible, keratoplasty should be delayed until the inflammation is well controlled and the eye has remained quiet for 6–12 months or longer, depending on the underlying diagnosis. Common immune disorders that are at particular risk for worsening inflammation after penetrating keratoplasty include ocular mucous membrane pemphigoid (MMP), Stevens-Johnson syndrome (SJS), uveitis, and collagen vascular diseases with ocular involvement.
A condition that deserves a special note is atopic disease. Patients with atopic keratoconjunctivitis frequently have chronic inflammation that can be exacerbated as a result of surgery. A typical scenario is an atopic patient with keratoconus who develops a significant immune response following penetrating keratoplasty. This inflammatory response has been termed postkeratoplasty atopic sclerokeratitis. These patients are considered immunologically high-risk and require appropriate pre- and postoperative care. It is important to note that patients with severe and long-standing atopic disease are prone to develop limbal stem cell deficiency, and this should be addressed before penetrating keratoplasty is considered.
Historically, chemical injuries have been considered a risk factor for corneal graft failure. In the CCTS, grafts failed in 26 of the 39 patients (67%) with chemical injuries. More importantly, patients in the CCTS with chemical injury were 3.5 times more likely to have nonrejection graft failure than other high-risk eyes. It is well recognized now that most of these patients probably suffered from limbal stem cell deficiency and, in that setting, a penetrating keratoplasty was likely to fail due to ocular surface disease. While it is true that these patients were also at higher risk for immunologic rejection, the appropriate strategy is to first treat the surface disease with a limbal stem cell transplantation.
In addition to chemical injuries, this principle applies to conditions such as SJS, MMP, and congenital aniridia. Conjunctivalization and superficial neovascularization are hallmarks of limbal stem cell deficiency (see Fig. 126.4), which should be distinguished from the deeper stromal neovascularization frequently present in rejected corneal grafts (see Fig. 126.3). Except in emergent cases, penetrating keratoplasty is contraindicated in the presence of significant limbal stem cell deficiency. The appropriate strategy is to first perform a limbal stem cell transplant, followed by a subsequent keratoplasty if still necessary.
Studies on keratoplasty after limbal stem cell transplantation have mostly evaluated the keratolimbal allograft (KLAL) procedure. When a KLAL has been performed, a subsequent corneal graft is still considered to be at high risk for rejection. In one study, 16 of 45 eyes (35.6%) receiving simultaneous penetrating keratoplasty and KLAL developed endothelial rejection. The risk of corneal graft rejection may be lower if penetrating keratoplasty and KLAL are performed separately, but it is still considerably higher than in cases without associated surface disease. Successful visual outcomes have been reported in patients undergoing penetrating keratoplasty and deep anterior lamellar keratoplasty (DALK) following stem cell transplantation. Underlying limbal stem cell deficiency should be addressed prior to keratoplasty to improve outcomes. In addition to limbal stem cell deficiency, patients with severe ocular surface disease frequently have associated aqueous tear deficiency, mucin deficiency, and lid abnormalities, all of which contribute to a poor prognosis for corneal transplantation.
In the CCTS, the age of the recipient was a significant risk factor for graft rejection. Patients younger than 40 years had twice the risk of rejection and graft failure compared with their older counterparts. Interestingly, recent data from the CDS showed trends toward a higher rate of graft failure in patients 70 years or older versus those younger than 60 years. There are a number of potential nonimmunologic reasons for the lower success rate of penetrating keratoplasty in children, such as the difficulty in performing adequate follow-up examinations and the child’s inability to report symptoms. In a retrospective case series of 574 children younger than 18 years undergoing penetrating keratoplasty, the best graft survival at 60 months of follow-up was reported in patients with keratoconus, with a rejection rate of 27%. In another retrospective case series, the outcome of penetrating keratoplasty in congenital hereditary endothelial dystrophy was evaluated. Recently, in the Corneal Preservation Time Study (CPTS), 1330 eyes of 1090 patients undergoing Descemet stripping automated endothelial keratoplasty (DSAEK) were assigned to receive a donor cornea. The cumulative graft rejection rate was 3.6%, and graft rejection was significantly higher in younger recipients (16.7% in the 15 recipients younger than 51 years).