The objectives of the ESP were to achieve more efficient use of kidneys from elderly donors and to reduce the time elderly patients wait for a kidney transplant. Overall, this 5-year analysis shows that the aims of the program (see also 2.1.) were met. The number of elderly donors increased from 169 (10% of all donors) in 1998 to 239 (almost 15% of all donors) in 2004. Simultaneously, the number of patients transplanted within the ESP increased from 227 in 1999 to 382 in 2003. The waiting time for elderly recipients transplanted within the ESP was shorter than before the introduction of the program (Smits, et al., 2002). The cold ischemia time for ESP patients was significantly shorter, with a mean of approximately 12 hours compared with over 17 hours in both control groups. Bearing in mind that the control groups differed from the ESP group in either donor or recipient age, the main clinical outcomes in recipients of organs from donors aged 65 or older were not negatively affected by the ESP allocation. This is discussed in more detail below.
Germany, the Netherlands and Belgium were the largest contributors to the ESP and the two control groups, which compares well to the overall distribution of renal transplants performed within the Eurotransplant region (Doxiadis, et al., 2004). When considering the distribution of donor age, and because of its importance for interpreting outcome, it is worth highlighting that 50% of the donors in the ESP were above 70 years of age compared to only 3.7% of donors in Control 2. At an average age of 67.7 years, the ESP recipient was 10 years older than a recipient in Control 1 and 4 years older than a recipient in Control 2. Interestingly, there were significantly more female donors in the ESP group (54.2%) than in Control 2 (41.6%), while at the same time there were significantly more male recipients in the ESP group (64.8%) than in Control 2 (60%). Differences in graft survival according to donor gender have been reported by Zeier et al.: death-censored actuarial renal allograft survival from female compared with male donors was worse in female recipients, and worse still in male recipients. The donor gender-associated risk ratio for graft loss was 1.15 in female recipients and 1.22 in male recipients (Zeier, et al., 2002). According to this report, the gender distribution in the ESP group might have been less favourable than in the other two groups. The incidence of diabetes and hypertension recorded in the medical history of the donor was highest in the ESP group, also reflecting a potentially less favourable starting point for this group. As expected from a much younger donor population, the percentage of traumatic causes of donor death in Control 2 (any to old) was almost twice as high as in both the ESP group and Control 1. Eurotransplant evaluated the effect of different preservation solutions on initial graft function and long-term graft survival in two prospective randomized studies.
They concluded that HTK is comparable to UW in its preservative abilities, whereas EC should be avoided (de Boer, et al., 1999). Across all groups, UW was the most commonly used preservation solution (62.2%), followed by HTK/Bretschneider (34.4%). In Control 2, UW was used even more frequently (66.9%), while HTK/Bretschneider was used more often in the ESP group (38.9%). Cox regression analyses performed in this 5-year analysis identified the use of a preservation solution other than UW and HTK/Bretschneider as an independent risk factor for patient survival that almost doubled the risk of death and was also associated with a 50% increased risk of rejection in the model for ESP and Control 2.
Prior to the ESP, the waiting time for older patients was significantly longer than for other age groups (Smits, et al., 2002). In the first-year analysis the ESP group had already been shown to benefit, as the median time on the waiting list had become comparable: 3.94 years for ESP, 3.61 for Control 1 (old to any) and 3.89 for Control 2 (any to old). By the time of the 5-year analysis the waiting time for the ESP group had shortened by a further 5 months, while the waiting time in Control 1 had increased by 2.4 months and in Control 2 by one year. This trend for patients transplanted via ETKAS is in line with published reports of increasing waiting times over recent years, with over 21% waiting more than 5 years (Doxiadis, et al., 2004). The median waiting time for waitlisted patients older than 65 based on the OPTN data as of July 8, 2005 was, at 3.67 years, comparable to that of the ESP patients (Source: www.optn.org). As expected, 97% of the organs in the ESP were allocated locally or regionally, compared to only 50% of the organs transplanted via ETKAS. This is about 10% less than the 61% of organs transplanted locally or regionally in the ETKAS group for all ages since 1996 (Doxiadis, et al., 2004). However, one should bear in mind that ESP patients were included in ETKAS from 2001.
As Control 1 and 2 kidney transplants were allocated via an HLA-driven system, the median number of HLA-A, -B, -DR mismatches was significantly lower than in the age-matched kidney transplants of the ESP group (3 and 2 vs. 4, respectively). Not surprisingly, the number of class I (HLA-A and -B) and class II (HLA-DR) mismatches was significantly higher in the ESP group as well: 99.7% of ESP patients had at least 1 class I and 92.9% at least 1 class II mismatch. In comparison, within the whole of ETKAS a steady decrease in the number of 0-2 mismatch transplants and an increase in the number of 3-6 mismatch transplants has been observed (Doxiadis, et al., 2004). The trade-off between immunological and non-immunological risk factors was taken into account before implementation of the ESP (Smits, et al., 2000). It had been postulated that the HLA matching effect in kidney transplantation decreases in donors above 40 years of age (Cicciarelli, et al., 1999). Furthermore, as HLA compatibility was disregarded in the ESP, the program was restricted to non-immunized (PRA < 5%) recipients awaiting their first transplant. In our ESP analysis population, 22 out of 1405 patients (1.6%) were highly sensitized; the remaining 98.4% complied with the ESP rule. In both control groups the number of highly sensitized patients was significantly higher at approximately 10%, which is comparable to the overall ET population (Doxiadis, et al., 2004). A further prerequisite of the program was to reduce the cold ischemia time as far as possible by allocating organs locally, to minimize the accumulation of risk factors and hence improve the outcome in these recipients. Although median cold ischemia times were significantly lower in the ESP group compared to the controls (p < 0.001), only 50% of the ESP transplants had a cold ischemia time of < 12 hours and only 26% of < 8 hours.
Compared with the first-year analysis, the cold ischemia time for all three groups had been successfully reduced, and differences between the groups had become smaller. Reduction of non-immunologic damage by ensuring a short CIT is considered important by several groups to counterbalance possible immunologic effects on graft function, especially in the old-for-old setting (Klehr, et al., 1996; Lee, et al., 2000; Preuschof, et al., 1991; Shoskes and Cecka, 1997). A single-centre ESP publication (Giessing, et al., 2003) reported a very short CIT in the ESP group (8.3 hours) compared to other ESP reports (Smits: 12 hours (Smits, et al., 2002); Schlieper: 13.3 hours (Schlieper, et al., 2001); Beckurts: 9.5 hours (Beckurts, et al., 2001)). The author strongly believes that the good graft function (only 12% DGF compared to 29.7% in the ESP group as a whole) and graft survival observed were driven to a great extent by this reduction of CIT. On the other hand, data from Opelz based on the CTS registry suggest that a very short CIT (< 6 hours) may not be advantageous, and that only a CIT > 24 hours has a negative impact on graft survival. He showed that HLA matching had a highly significant impact even when the analysis was restricted to patients with an ischemia time of 0-12 hours (Opelz, 2002). In this analysis, although many of the baseline characteristics of the ESP group were less favourable, these appear to have been successfully balanced by the ESP, at least in the short term, as evidenced by the fact that early graft function was as good as in Control 2 and significantly better than in Control 1.
The ESP (old to old) group had the lowest 1- and 5-year patient survival rates, as determined by Kaplan-Meier analysis, with 86% and 60% compared to 88% and 71% for Control 1 (old to any) and 90% and 74% for Control 2 (any to old), respectively. The OPTN database reported an unadjusted 5-year survival of 65.2% in recipients older than 65 years of age and 78% in recipients aged 50-54 (Source: www.optn.org). Meier-Kriesche reported a 1-year survival of 81% in recipients aged 60 and above (Meier-Kriesche, et al., 1999). Fritsche and Arbogast reported a similar 1-year patient survival for a subgroup of the ESP patients, while a nationwide analysis from Israel reported a lower 1-year survival rate of only 54% in the old-to-old population (Arbogast, et al., 2005; Fritsche, et al., 2003; Weiss-Salz, et al., 2005). The results for Control 1 are, as expected, in line with what has been published by Eurotransplant, and survival rates for Control 2 are also comparable to those published by Fritsche et al. (Fritsche, et al., 2003; Smits, et al., 2002). Differences between groups could be explained by risk factors such as recipient gender, delayed graft function, donor age, graft loss, recipient diabetes, a preservation solution other than UW and HTK/Bretschneider, and cardiovascular disease in the medical history of the recipient. Interestingly, DGF increased the risk of death by 40%, and use of a preservation solution other than UW and HTK/Bretschneider almost doubled the risk of death in one of the analyses. This finding is certainly interesting, but it is supported by a relatively small number of cases in which neither of the two main solutions was used. Further evaluations should be considered before a specific recommendation can be given regarding the avoidance of specific preservation solutions. Maintaining short ischemia times, or reducing them even further, seems to be important in order to decrease the incidence of DGF.
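The Kaplan-Meier rates discussed here are produced by the standard product-limit estimator, which can be sketched in a few lines. The following is a minimal illustration on hypothetical follow-up data (times and censoring flags invented for the example); it is not the study's actual analysis software:

```python
# Minimal Kaplan-Meier (product-limit) estimator.
# Each observation is (time, event): event=True for death, False for censoring.

def kaplan_meier(observations):
    """Return a list of (time, survival_probability) at each event time."""
    observations = sorted(observations)
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    i = 0
    while i < len(observations):
        t = observations[i][0]
        deaths = 0
        removed = 0
        # group all observations tied at time t (deaths and censorings)
        while i < len(observations) and observations[i][0] == t:
            if observations[i][1]:
                deaths += 1
            removed += 1
            i += 1
        if deaths:
            # survival drops by the factor (1 - d/n) at each event time
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
    return curve

# Hypothetical follow-up times in years; not data from the ESP registry.
curve = kaplan_meier([(1, True), (2, False), (3, True), (4, True), (5, False)])
```

Censored patients (here at 2 and 5 years) leave the risk set without lowering the survival estimate, which is why Kaplan-Meier analysis is the appropriate tool for registry data with incomplete follow-up.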
Advocating the use of UW for preservation should also be considered. Of great importance is also the result of the sub-analyses for Control 2, split by donor age (< 60 vs. ≥ 60 years), and the subsequent model extrapolated to simulate the survival of Control 2 with a donor and recipient age comparable to the ESP group. These analyses were performed to account for the differences in both donor and recipient age between ESP and Control 2. Both analyses showed that patient survival for Control 2 d ≥ 60, as well as for Control 2 extrapolated to ESP parameters, was not different from the survival of the ESP group, strongly suggesting that the age of the donor is the main variable driving differences in survival between ESP and Control 2.
Uncensored graft survival in the ESP (75% at 1 year and 47% at 5 years post-transplant) was comparable to Control 1 (74% at 1 year and 51% at 5 years post-transplant) but significantly lower than Control 2 (83% at 1 year and 64% at 5 years post-transplant). However, the difference in graft and patient survival between ESP and Control 2 disappeared if Control 2 was restricted to donor age ≥ 60. This demonstrates the impact of donor age on long-term outcome and suggests the ESP concept was successful in optimizing the outcome when compared to a similar population transplanted via ETKAS. The fact that Control 1 showed graft and patient survival similar to the ESP group, despite the average recipient being 10 years younger, suggests that an old organ transplanted into a younger recipient might actually negatively impact that recipient's outcome. This finding is in line with results published by Waiser, who found the “old to young” transplants to have the poorest graft survival (approximately 50% at 5 years, see also page 22). Cox regression models as well as analyses of Control 2 d ≥ 60 and extrapolated Control 2 showed results similar to those of the survival analysis, with DGF and male recipient gender being strong independent risk factors and no significant differences when Control 2 was restricted to a more similar donor age.
Death-censored graft survival in the ESP was not different from Control 1 (1 year: 83% vs. 81%; 5 years: 67% for both), but significantly different from Control 2 (1 year: 90%; 5 years: 81%). The 1-year graft survival rates reported by Fritsche for a subpopulation of the ESP and Control 2 are almost identical; however, their results were not statistically significant (Fritsche, et al., 2003). The increased size and duration of follow-up in this evaluation is the most likely explanation for the difference becoming significant. The most important insight from this analysis is that the old donor kidneys did not survive longer in younger recipients (Control 1, old to any), despite having fewer HLA mismatches and fewer rejections. Hariharan and colleagues suggested in 1997 that older donor kidneys have better graft survival when transplanted into older recipients rather than younger recipients (Hariharan, et al., 1997). The fact that kidneys from younger donors survive longer is not surprising. In addition, differences in death-censored graft survival between ESP and Control 2 could be explained by differences in recipient gender and DGF, but interestingly also by HLA class I mismatches: for every mismatch in class I antigens, the risk of graft loss increased by 15%.
Results published by Waiser et al. showed that graft survival of kidneys from old donors was significantly reduced in young recipients compared to old ones. Graft loss in the “old to young” group was mainly due to acute and chronic rejection (Waiser, et al., 2000). Interestingly, similar results were found in this analysis, with patients in Control 1 (old to any) having lost more grafts due to rejection (42.4%) compared to ESP (old to old, 29.5%) or Control 2 (any to old, 28.8%). A cumulative effect, resulting from ischemia/reperfusion damage and the reduced capacity of kidneys from older donors to respond to physiological and pathological stresses, might explain this phenomenon. While Meier-Kriesche suggested that donor age represents a significant risk factor for patient death with a functioning graft - speculating that the worse clearance of an aged graft translates into hypertension in a younger recipient or represents an intrinsic risk factor for patient survival (Meier-Kriesche, et al., 2002) - the current analysis did not show a significant difference between the three groups. Overall, of the total number of patients who died, 76.6% had functioning grafts. It is worth mentioning that the definitions and selection of marginal donors in the US and Europe are quite different. In the US, expanded criteria donors (ECD) are defined as all donors older than 60 years and donors older than 50 years with any 2 of the following criteria: (a) hypertension; (b) cerebrovascular cause of brain death; or (c) donor SCr > 1.5 mg/dl (Stratta, et al., 2004), while no clear definition exists in Europe (Tullius, et al., 2001). However, published data, as well as this report, suggest that the quality of organs accepted for transplantation is much lower in Europe than in the US. This needs to be taken into account when comparing results from Europe and the US.
Occurrence of serious opportunistic infections at any time post-transplant was very common and highest in ESP (51%) and Control 1 (50.4%) patients, compared to 38.8% in Control 2. Overall, cardiovascular events (defined as MI, bypass grafting, stroke or amputation) and malignancies at any time post-transplant were reported in 14.3% and 9.5% of all patients, again with the highest incidence in the ESP group (15.2% and 10.3%, respectively). Almost 60% of all patients who died did so as a result of an infectious or cardiovascular event. Infectious death occurred in approximately 30% of patients in all three groups (no significant difference). Interestingly, death due to cardiovascular events occurred slightly less often in the ESP group (22.9%) than in Control 1 (32.4%) and Control 2 (32.5%). In 2001 Meier-Kriesche showed that both mortality related to infections and mortality secondary to cardiovascular disease increase progressively with increasing recipient age (Meier-Kriesche, et al., 2001). The finding of equal percentages of infectious deaths in all three groups, even in Control 1 with a much younger average recipient age, is somewhat surprising and suggests a potential impact of the older donor organ (Martins, et al., 2005). The higher incidence of impaired graft function, or the higher levels of immunosuppression associated with a higher acute rejection rate, might help explain this finding.
As mentioned previously, the success of local allocation and the shorter ischemia time is reflected in the significantly higher percentage of initial function, the reduced rate of DGF and the lower SCr values in the ESP group compared to Control 1. Figure 17 impressively shows that the ESP pattern for early function was almost identical to Control 2, where the donor organs were on average 20 years younger. Delayed graft function was seen in 29.7%, 36.2% and 30.9% of the ESP, Control 1 and Control 2 transplants, respectively. The single-centre ESP analysis by Giessing reported a DGF rate of only 12%, as opposed to a significantly higher incidence in the controls (43%). Voiculescu (Voiculescu, et al., 2002), whose ESP study group had a CIT of more than 12 hours, reported delayed graft function in 64% of patients. The first-year data published by Smits (Smits, et al., 2002) showed a 33% incidence of DGF with a mean CIT of 12 hours. With DGF being an independent risk factor in almost all our models for patient survival, censored and uncensored graft survival, and acute rejection, one argument in favour of the ESP is reconfirmed.
The preliminary analyses of the ESP that were published prior to this update had already reported a higher rate of acute rejection in the ESP group (around 40%; Giessing, et al., 2003; Schlieper, et al., 2001; Smits, et al., 2002; Voiculescu, et al., 2002) compared to Control 1 (30%; Smits, et al., 2002) and Control 2 (27.4%; Fritsche, et al., 2003). In general, the acute rejection rates for all groups reported in the 5-year database appear to be lower than expected from the initial reports. One reason could be that acute rejection was underreported. However, the ESP group still had significantly higher rates of acute rejection, biopsy-proven acute rejection, and early and late rejection, despite the fact that more than half of the patients received antibody induction therapy and 84% of the ESP patients were maintained on triple immunosuppression, highlighting the immunologic capacity even of aged recipients. Experimental data in a rat model support an enhanced cellular response in elderly recipients leading to accelerated chronic graft rejection (Pascher, et al., 2003). The high number of immunologic responses raises concerns about how effectively a shorter CIT counterbalances HLA mismatching in the ESP setting (Giessing, et al., 2004). However, long-term outcome did not seem to be negatively affected and, in fact, a much higher incidence of graft loss due to rejection was found in Control 1.
Cox regression analysis partially explained the differences in acute rejection rates between ESP and Control 2. Class I and class II HLA mismatches were identified as significant independent risk factors, increasing the risk of acute rejection by 32% and 26%, respectively. Patients with DGF had a 57% increased risk. One could speculate whether the risk of acute rejection associated with HLA mismatching could be overcome by using more intense immunosuppression, or at least a higher initial immunosuppression (de Fijter, 2005; Reutzel-Selke, et al., 2005). However, this must be balanced against the finding that the incidence of infectious complications and death secondary to infection is already very high in the ESP group. Another option would be to try to achieve better HLA matching while maintaining the same CIT. This could be achieved if allocation remained restricted to local recipients but HLA matching was used instead of waiting time as an allocation criterion (Fritsche, et al., 2003).
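Under the proportional-hazards assumption of the Cox model, independent risk factors act multiplicatively on the baseline hazard, so the quoted effects can be combined as a simple product. The following arithmetic sketch illustrates this for a hypothetical patient carrying all three factors at once; the function and the worked combination are our illustration, not part of the original analysis:

```python
from math import prod

def combined_hazard_ratio(hazard_ratios):
    """Relative hazard of a patient carrying all of the listed risk factors,
    assuming the factors act independently (proportional-hazards model)."""
    return prod(hazard_ratios)

# Hazard ratios quoted above: 1.32 (class I HLA mismatch effect),
# 1.26 (class II HLA mismatch effect), 1.57 (DGF).
combined = combined_hazard_ratio([1.32, 1.26, 1.57])
```

For this hypothetical patient the relative hazard of acute rejection works out to roughly 2.6 times the baseline, which illustrates why the accumulation of risk factors, rather than any single one, drives outcome in the old-for-old setting.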
Although statistically significant, the difference of 2 days in the median number of in-hospital days for transplantation between ESP and Control 1 (27 days) and Control 2 (25 days) does not seem clinically relevant. However, when looking at readmissions and the length of hospital stay during any readmission, the ESP patients appeared to experience more complications and require slightly longer hospital care than the control groups. The fact that the clinical condition at the most recent visit was judged by the treating physician as poor for 20.1% of ESP patients, compared to 12.9% of patients in Control 1 and 10.1% of patients in Control 2, also indicates that ESP patients faced more problems than patients in the control groups.
Isolated from any ethical concerns, an elderly patient awaiting renal transplantation might be biased towards wanting to wait for an organ from a younger donor. However, the benefit of transplantation for these patients has to be weighed against the risk of death on the waiting list. Wolfe et al. determined that renal transplantation doubles the life expectancy of a patient on dialysis listed for transplantation in the United States. Among patients who were 60-74 years old, the cumulative survival rate improved after the first year, with a projected increase in life span of five years and a decrease in the long-term risk of death of 61 percent (Wolfe, et al., 1999). In a more recent analysis from Europe, Oniscu and colleagues confirmed that, despite an initially higher risk of death, long-term survival for patients who undergo transplantation is significantly better than for patients who are listed but remain on dialysis. A successful transplant triples the life expectancy of a listed renal failure patient. In patients aged 65 years or older, transplantation led to a doubled life expectancy compared with dialysis, with this proportional increase being greater than that noted in patients aged 18 to 34 years (Oniscu, et al., 2005).
DiML DTD Version 4.0 | Certified document server of Humboldt-Universität zu Berlin