Despite the relative immune privilege of the cornea as a transplant tissue (both the recipient corneal bed and the anterior chamber are immune-privileged sites), the most common cause of corneal graft failure in all reports is allogeneic rejection. In first-time graft recipients with no vascularisation of the recipient corneal bed, 2-year survival rates exceed 90%; this decreases to 35% to 70% in recipients with high-risk factors for rejection. In one-third of all corneal grafts that fail, signs of a destructive attack by the immune system have been observed. A rejection episode results in loss of donor endothelial cells, which are critical for maintenance of corneal transparency. As human endothelial cells do not repair by mitosis, donor corneal transparency is lost if cell density falls below the threshold necessary to prevent stromal swelling. Endothelial decompensation results either from an irreversible episode of acute graft rejection or at an interval following one or more episodes of rejection that have been reversed by therapy. Endothelial cells are thus the critical target of the allogeneic response. While reversal of acute graft rejection episodes does not present the challenges in the cornea that it does in other transplanted tissues, effective prophylaxis in corneal graft recipients identified as at high risk of rejection is much less evidence-based. Thus, the impact of graft rejection continues to justify a high priority in corneal research. Although the first successful penetrating corneal graft was reported in 1906, another half-century passed before the first description of opacification of a previously clear corneal graft was published. Paufique named this event “maladie du greffon” (disease of the graft) and suggested that sensitization of the recipient by the donor was the cause.
This description followed earlier experiments reported by Medawar, in which differences were observed between rabbit skin grafts of donor and recipient origin, giving rise to the term “histocompatibility.” Maumenee subsequently confirmed this suggestion in a rabbit model of corneal transplantation, showing that donor corneas could induce an immune reaction. The subsequent development of corneal transplantation models in the rat and mouse facilitated studies of rejection in inbred donor and recipient animals with a wide range of investigative immunological reagents.
Preoperative characteristics of the graft-recipient eye that indicate a significantly elevated risk of graft failure can be identified in many patients. Prospective graft-recipient corneas with two or more quadrants of deep vascularisation, or those bearing a previously rejected graft that is inflamed at the time of transplantation, are at significantly higher risk of rejection. There is less robust evidence in the published literature that grafts in children, large-diameter donor corneas, and proximity of the donor cornea to the recipient limbus confer a higher risk. More than one of these factors may be present in a patient. Furthermore, one or more of the above factors may coexist with additional clinical features that confer a significant risk of graft failure, such as glaucoma or ocular surface disease. A clinician must therefore evaluate these preoperative clinical features carefully when deciding whether to proceed with corneal transplantation. Once transplantation is successfully completed, care must be taken to prevent postoperative events that lead to rejection, for example, vascularisation of the recipient cornea or graft wound, suture loosening, or graft infection.
In reports from large cohorts of corneal graft recipients, the proportion experiencing rejection at some stage post-transplant ranges from 18% to 21%. In those graft recipients in whom rejection occurs, reported rates of successful reversal of the rejection episode range from 50% to 90%. Allograft rejection occurs most commonly in the second 6 months post-grafting, and more than 10% of observed reactions have been reported as late as 4 years after surgery. These data indicate that all corneal grafts need long-term surveillance and remain at risk indefinitely.
Descriptions of the pathological features of corneal transplant rejection derive from examination of grafts removed following irreversible failure. These specimens illustrate late changes of end-stage corneal opacification, usually months after rejection treatment. Characteristic stromal findings are vascularisation with mononuclear cell infiltration and keratocyte loss; few if any endothelial cells remain. Several studies have shown increased numbers of HLA class II-positive cells infiltrating the stroma in sections of rejected grafts.
Epithelial rejection, diagnosed by a linear opacity stained with fluorescein, comprised up to 10% of all rejection episodes in one series and occurred on average 3 months after grafting. Although dead donor epithelial cells are rapidly replaced by recipient epithelial cells, and no scarring occurs, this type of rejection indicates that the recipient is now sensitized to the donor, and it can progress to stromal and/or endothelial rejection. Stromal rejection is characterized by nummular subepithelial infiltrates, identical to those found in adenovirus keratitis. Patients with either epithelial or stromal rejection may be asymptomatic or have only mild ocular discomfort. In contrast, patients with endothelial rejection usually present with visual disturbance and symptoms of iritis. If examined early after rejection symptoms begin, anterior chamber cell infiltration without flare or graft abnormality will be seen. When symptoms begin later, the signs, in succession, are aggregated alloreactive cells adherent to the graft endothelium evident as keratic precipitates, an endothelial line with precipitates, and localized edema corresponding to a rejection line or total graft edema. Visible graft precipitates on slit-lamp biomicroscopy imply focal and variable, but irreversible, endothelial cell loss, compromising endothelial pump function and resulting in stromal edema in those grafts with severe inflammation or low endothelial cell density before rejection onset. Pachymetry is helpful in detecting an increase in edema and also deturgescence following the start of steroid treatment. In one study, graft thickness during rejection, objectively measured by pachymetry, was found to be second only to the preoperative diagnosis as a prognostic sign for reversibility of a rejection episode. Risk factors for significant endothelial cell loss are a delay of more than 1 day in initiating anti-rejection treatment and recipient age older than 60 years.
The objective of treatment is to reverse the rejection episode at the earliest possible time, minimize donor endothelial cell loss, and preserve graft function. Because corneal transplants have the anatomical advantage of being superficial, intensive administration of a topical corticosteroid, such as dexamethasone 0.1%, successfully reverses most endothelial rejection episodes. In most cases in which topical steroid fails to reverse rejection, the cause is likely a delay in recognition and initiation of treatment, resulting in significant donor endothelial cell loss. In others, failure to reverse rejection may be due to the failure of topical steroid to suppress effector components of the allogeneic response. With respect to additional systemic steroid, a single dose of intravenous methylprednisolone was found to be more effective than oral steroid in patients with endothelial rejection who presented within 8 days of onset. A second pulse of intravenous methylprednisolone at 24 or 48 hours gave no benefit compared to a single dose at initial diagnosis. However, a subsequent randomized trial demonstrated no significant benefit of intravenous methylprednisolone in addition to topical steroid, in respect of graft survival or interval to a subsequent rejection episode, within a 2-year follow-up period. In the same study, endothelial rejection was reversed in 33 of 36 patients treated, indicating that steroid-resistant rejection is uncommon. Other studies examining the efficacy of topical or oral cyclosporin administered in combination with intravenous steroid have reported similar outcomes, with irreversible rejection in a small proportion of patients.
In vascularised organ allotransplantation, there is robust evidence supporting HLA matching of donor and recipient, with the data of Opelz and others demonstrating stratification of the risk of rejection according to the number of class I and, especially, class II mismatches. HLA matching is routine, internationally, in cadaveric renal and other organ transplantation. By contrast, in corneal transplantation, some countries routinely match donor and recipient for HLA class I and class II antigens in recipients at high risk of rejection, while in others no matching takes place. Roelen suggested a benefit for HLA-A and HLA-B matching of high-risk corneal allograft recipients on the grounds that primed, donor-specific cytotoxic T cells were present in rejected corneas but absent in patients with good graft function. However, the benefit of histocompatibility matching in corneal transplantation has been disputed and is less apparent than the benefit for solid organ grafts, even in corneal recipients at perceived high risk of graft rejection. Two large prospective studies of HLA-A, HLA-B, and HLA-DR antigen matching in high-risk recipients have reported divergent findings. The Collaborative Corneal Transplantation Studies Research Group reported that matching of these antigens did not decrease the risk of corneal graft failure secondary to rejection. In contrast, the Corneal Transplant Follow-up Study found an increased risk of graft rejection with mismatch of HLA class I antigens (relative risk 1.27 per mismatch), but a decreasing risk of rejection with HLA-DR mismatches (relative risk 0.58 per mismatch) in high-risk patients. This study therefore supported matching at HLA-A and HLA-B but not HLA-DR. The possible benefit of planned HLA-DR mismatching in a setting of known class I histocompatibility is being investigated in an ongoing prospective trial.
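To illustrate how "per-mismatch" relative risks of this kind combine, the sketch below compounds the figures reported by the Corneal Transplant Follow-up Study under an *assumed* multiplicative (independent-effects) model, as is conventional for per-unit hazard ratios from a proportional-hazards analysis. The function name and the multiplicative assumption are illustrative, not taken from the study itself.

```python
# Per-mismatch relative risks reported by the Corneal Transplant Follow-up Study.
RR_CLASS_I = 1.27   # per HLA class I (HLA-A or HLA-B) mismatch
RR_HLA_DR = 0.58    # per HLA-DR mismatch (protective in this study)

def combined_relative_risk(class_i_mismatches: int, dr_mismatches: int) -> float:
    """Relative risk of rejection versus a fully matched graft, assuming
    each mismatch acts independently (multiplicative model)."""
    return (RR_CLASS_I ** class_i_mismatches) * (RR_HLA_DR ** dr_mismatches)

# A graft with four class I mismatches and no DR mismatch:
print(round(combined_relative_risk(4, 0), 2))   # 1.27**4, roughly 2.6
# Two DR mismatches partially offset two class I mismatches under this model:
print(round(combined_relative_risk(2, 2), 2))   # 1.27**2 * 0.58**2, below 1
```

Under this model a DR mismatch lowers the combined risk, which is why the study's findings argued for matching at HLA-A and HLA-B but deliberate mismatching at HLA-DR.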
In 1996, a randomized, albeit retrospective, study revealed a beneficial effect of HLA-DRB1 matching in recipients at high risk because of vascularisation and/or retransplantation. Subsequently, a beneficial effect of HLA-DPB1 matching in high-risk corneal transplantation was shown, with a significantly higher rate of 1-year rejection-free graft survival compared to transplantation without matching. The effect of HLA matching in corneal transplantation is therefore unclear, and the data for class II matching are ambiguous. Resolution of this clinically important issue is not simple. In contrast to the situation in solid organs, the results of corneal matching are likely to be influenced by the following factors:
It is also worth noting that the effects of HLA matching on corneal graft outcome have not yet been investigated in the setting of systemic immunosuppressive prophylaxis. Studies in solid organ transplantation have shown that more effective rejection prophylaxis can override an HLA-matching effect in unsensitized recipients.