Abstract
The scarcity of suitable organ donors leads to protracted waiting times and mortality in patients awaiting lung transplantation. This study aims to assess the short- and long-term effects of a high emergency organ allocation policy on the outcome of lung transplantation.
We developed a simulation model of lung transplantation waiting queues under two allocation strategies, based either on waiting time only or on additional criteria to prioritise the sickest patients. The model was informed by data from the United Network for Organ Sharing. We compared the impact of these strategies on waiting time, waiting list mortality and overall survival in various situations of organ scarcity.
The impact of a high emergency allocation strategy depends largely on the organ supply. When organ supply is sufficient (>95 organs per 100 patients), it may prevent a small number of early deaths (1-year survival: 93.7% against 92.4% for waiting time only) without significant impact on waiting times or long-term survival. When the organ/recipient ratio is lower, the benefits in early mortality are larger but are counterbalanced by a dramatic increase in the size of the waiting list. Consequently, we observed a progressive increase in mortality on the waiting list (although still lower than with waiting time only), a deterioration of patients’ condition at transplant and a decrease in post-transplant survival times.
High emergency organ allocation is an effective strategy to reduce mortality on the waiting list, but causes a disruption of the list equilibrium that may have detrimental long-term effects in situations of significant organ scarcity.
Introduction
Lung transplantation (LT) is the ultimate therapy available for patients with end-stage lung disease [1]. Despite continuous efforts to expand the pool of donors, the number of available organs is still insufficient to meet the growing demand, leading to deaths on the waiting list and protracted waiting times for those who ultimately undergo LT [2, 3]. Although organ shortage is common in the field of solid organ transplant, it is further aggravated in the case of LT by the very low rates of suitable donors due to the fragility of the lung and the conservative practices of most transplant centres [4].
Allocating this scarce resource in an equitable and efficient manner is complex. Health authorities must balance multiple and sometimes conflicting allocation goals [5, 6], leading to marked variation between countries. In 2005, the USA moved from a system based on waiting time to a system based on the expected benefit of LT, the lung allocation score (LAS) system [7, 8]. However, the LAS has been criticised on methodological grounds and is associated with a significant increase in resource use [9, 10]. In Europe, although two countries have adopted the LAS system (Germany and the Netherlands) [11–13], most countries rely on a system in which some patients are prioritised under a high emergency status while the remaining receive a graft based on time spent on the waiting list [14]. However, the criteria to enter the high emergency list vary markedly. For instance, patients with chronic obstructive pulmonary disease (COPD) may be granted a high emergency status in the UK, but not in France, where this status can be granted only to patients with idiopathic pulmonary fibrosis, cystic fibrosis or pulmonary arterial hypertension.
These lung transplant allocation policies have been designed empirically and are revised over time when limitations appear. Such limitations may become apparent only several years after implementation [15, 16]. For example, a high emergency LT system (HELTx) may decrease global death rates among patients awaiting LT in the short term, because the sickest patients are removed from the waiting list [17]. However, the long-term impact of such a system is less clear. In France, for instance, where a similar system was adopted in 2004 for heart transplants, the number of patients listed with emergency status is now approaching 50% and decreased post-transplantation survival rates are being observed [18].
Numerical simulations are powerful tools for investigating the behaviour of complex systems subject to random variations [19–21]. Agent-based modelling is a computational method that enables experimenting with autonomous agents that evolve and interact within a defined environment. This method is particularly well suited to simulating a queuing system, and hence to comparing organ allocation policies. In this study, we used numerical simulations to assess the short- and long-term effects of the implementation of a high emergency rule for transplant allocation on the outcome of patients with end-stage lung diseases.
Methods
We developed an agent-based model of LT waiting queues using NetLogo [22]. Virtual patients (VPs) were individually created according to four diagnosis groups: (A) obstructive lung disease (e.g. emphysema); (B) pulmonary vascular disease (e.g. primary pulmonary hypertension); (C) cystic fibrosis or immunodeficiency disorder; and (D) restrictive lung disease (e.g. idiopathic pulmonary fibrosis). Each VP was assigned the following characteristics: age at enrolment, blood type, body mass index, diabetes, New York Heart Association (NYHA) class IV, pulmonary forced vital capacity, 6-min walk test distance above 150 feet (∼46 m), O2 requirement at rest, use of continuous mechanical ventilation, serum creatinine, pulmonary artery pressure and pulmonary capillary wedge pressure (figure 1). These characteristics were sampled from realistic distributions based on first enrolment data from 8315 patients seeking LT during 2005–2010 from the United Network for Organ Sharing (UNOS) database. Changes in VPs’ quantitative characteristics over time replicated those of UNOS patients in the first 24 weeks following enrolment. Only the first 24 weeks were considered, to account for informative censoring, whereby patients with more severe disease progression would be more likely to be removed early from the waiting list. Changes in qualitative variables were not modelled, except for continuous mechanical ventilation and NYHA class (see supplementary appendix).
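As an illustration of the patient-generation step, the sketch below instantiates a VP in Python. The published model was implemented in NetLogo, and the group frequencies and sampling distributions shown here are placeholders, not the distributions fitted to the UNOS data.

```python
import random
from dataclasses import dataclass

@dataclass
class VirtualPatient:
    group: str            # diagnosis group A, B, C or D
    age: float            # age at enrolment (years)
    blood_type: str       # ABO blood type
    fvc_pct: float        # forced vital capacity (% predicted)
    on_ventilator: bool   # continuous mechanical ventilation

def make_virtual_patient() -> VirtualPatient:
    # Hypothetical group frequencies and characteristic distributions,
    # standing in for the distributions fitted to UNOS enrolment data.
    group = random.choices("ABCD", weights=[0.35, 0.05, 0.15, 0.45])[0]
    return VirtualPatient(
        group=group,
        age=random.gauss(52, 12),
        blood_type=random.choices(["O", "A", "B", "AB"],
                                  weights=[0.45, 0.40, 0.11, 0.04])[0],
        fvc_pct=random.gauss(48, 15),
        on_ventilator=random.random() < 0.03,
    )
```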
VPs were followed from enrolment on the waiting list for LT until death. New VPs and organs arrived uniformly over time. We simulated a centralised system inspired by the French one and considered two main scenarios for the annual numbers of VPs and organs: (1) scenario R96, where the yearly numbers of new recipients and arriving organs mimicked the French situation in 2012 (335 new recipients for 322 organs per year, i.e. 96 organs per 100 VPs) [18]; and (2) scenario R80, an alternative with fewer available organs (268 organs for 335 recipients, i.e. 80 organs per 100 VPs). Other intermediate scenarios with organ/recipient ratios ranging from 60% to 105% were also considered.
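Under the assumption that arrivals are uniform over time, scheduling patient and organ arrivals reduces to evenly spaced time points. A minimal sketch with the per-year counts of scenarios R96 and R80 (reading "uniformly" as evenly spaced rather than uniformly random):

```python
def arrival_times(n_per_year: int, years: int) -> list[float]:
    """Evenly spaced arrival times (in years)."""
    return [(i + 1) / n_per_year for i in range(n_per_year * years)]

# Scenario R96: 335 new recipients and 322 organs per year (96 organs per 100 VPs).
patient_arrivals = arrival_times(335, 15)
organ_arrivals_r96 = arrival_times(322, 15)
# Scenario R80: 268 organs for the same 335 recipients (80 organs per 100 VPs).
organ_arrivals_r80 = arrival_times(268, 15)
```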
The simulation progressed as follows:
1) Upon enrolment, a time of survival without transplantation was drawn for each VP based on their characteristics using the equations from the LAS pre-transplant survival model [23].
2) Upon arrival, organs were allocated to an ABO-compatible VP, selected according to the operating rule of the waiting list. Two allocation rules were explored: the “first-in, first-out” rule (FIFO), where organs were allocated to the patient who had been enlisted for the longest time, and the “high emergency lung transplantation” rule (HELTx), where organs were first tentatively matched to recipients enlisted on a high-priority sub-list intended for patients in imminent danger of death without LT, then to other recipients according to waiting time (a sketch of both rules follows this list). In accordance with the French rules, enrolment on the high-priority list concerned only patients from groups B, C and D. In theory, only patients whose death is expected in the very short term should be enlisted on the high-priority list. However, while the date of death could be drawn from the LAS equations, simulating whether and when the criticality of a patient's state will be recognised by the transplantation team and trigger enrolment on the high-priority list is less straightforward and requires additional assumptions. Given this imperfect identification, VPs were enlisted on the high-priority list 3 weeks before an estimated date of death, which reflects the beliefs of the transplantation team and was randomly drawn around the actual date of death. Based on expert opinion, this procedure was parameterised so that 92% of patients at risk of imminent death were placed on the high-priority list. The robustness of the results to these assumptions was tested in a sensitivity analysis.
3) Upon LT, organs were removed from the organ pool and grafted VPs were removed from the waiting list. Post-transplantation survival was then computed from the VP's characteristics at the time of transplant, using the equations from the LAS post-transplant survival model [23]. The yearly mortality rate was calculated as the number of VPs who died on the waiting list in a given year (i.e. who did not receive a transplant before their actual date of death) divided by the total number of VPs who were on the waiting list at any point during that year.
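To make step 2 concrete, the following is a minimal Python sketch of the two allocation rules. It assumes the waiting list is ordered by enrolment time and that each patient object exposes hypothetical blood_type and high_priority attributes; the published model was implemented in NetLogo, so this is an illustration, not the authors' code.

```python
# ABO compatibility: donor blood type -> recipient types that can accept it.
ABO_COMPATIBLE = {
    "O": {"O", "A", "B", "AB"},
    "A": {"A", "AB"},
    "B": {"B", "AB"},
    "AB": {"AB"},
}

def allocate(organ_abo, waiting_list, rule="HELTx"):
    """Return the recipient for an arriving organ, or None if no match.

    Assumes waiting_list is ordered by enrolment time (longest wait first)
    and that patients expose hypothetical .blood_type and .high_priority
    attributes.
    """
    candidates = [p for p in waiting_list
                  if p.blood_type in ABO_COMPATIBLE[organ_abo]]
    if not candidates:
        return None
    if rule == "HELTx":
        # Organs go first to the high-priority sub-list (groups B, C and D
        # judged at imminent risk of death), then by waiting time.
        urgent = [p for p in candidates if p.high_priority]
        if urgent:
            return urgent[0]
    # FIFO: the compatible patient who has waited longest.
    return candidates[0]
```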
We averaged a large number of stochastic simulations for each scenario in order to distinguish true differences from chance variation. Each simulation of the waiting list was run for 15 years, after 4 years of warm-up under the FIFO rule to avoid transient effects due to waiting list build-up. We reported summary characteristics as medians with 5th and 95th percentiles (i.e. 90% central range). Overall survival times were compared using the Kaplan–Meier method. A systematic sensitivity analysis was carried out by varying parameters over a wide range using Latin hypercube sampling. The software used was NetLogo for simulation and R 3.2.2 with the packages survival, RNetLogo and ggplot2 for statistical analyses [22, 24–26]. The code is available from www.runmycode.org. Further details are available in the supplementary appendix.
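As an illustration of the Latin hypercube design, the sketch below draws 100 parameter combinations with scipy's qmc module; the three parameters and their bounds are hypothetical stand-ins for the set actually varied in the study (which used R).

```python
from scipy.stats import qmc

# Illustrative Latin hypercube over three hypothetical parameters:
# organ/recipient ratio, weeks of anticipation before expected death,
# and the proportion of imminent deaths detected by the transplant team.
sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=100)                 # 100 points in [0, 1)^3
lower, upper = [0.60, 1.0, 0.80], [1.05, 6.0, 1.00]
designs = qmc.scale(unit_sample, lower, upper)      # rescale to the bounds

for ratio, weeks_ahead, detection_rate in designs:
    # Each row is one parameter combination; a batch of stochastic
    # waiting-list simulations would be run here and the outcomes averaged.
    pass
```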
Results
VPs’ characteristics
We performed one hundred 15-year simulations for each of the two scenarios of organ scarcity, for a total of over two million VPs. Simulated VPs’ characteristics closely reproduced those of patients in the UNOS registry at first enrolment and during follow-up (figure 1). After calibration of the baseline survival in the LAS score, the median expected survival without LT was 2.0 years (90% central range (CR) 0.2–10.8 years) and the median post-transplantation survival was 5.6 years (90% CR 1.1–14.2 years). Survival times varied according to the diagnosis group, with longer pre-transplantation survival for VPs from group A and longer post-transplantation survival for VPs from groups A and C (figure 2a).
Waiting list dynamics and short-term performance
Regardless of the allocation strategy, the overall short-term performance, as measured by mortality on the waiting list and waiting times, worsened as fewer organs were available. For instance, when there were 96 available organs for 100 VPs (scenario R96), the median waiting time before LT was 1.6 months (90% CR 0–5.8 months) and 3.6% of VPs died while on the waiting list (figure 2b and c). However, if the number of available organs dropped to 80 for 100 VPs (scenario R80), the median waiting time before LT increased to 8.3 months (90% CR 2.1–19.4 months) and mortality on the waiting list increased to 16.6%.
The operating allocation rule had a sizeable impact on VPs’ outcomes for a given number of available organs. The overall mortality on the waiting list was lower with the HELTx rule than with the FIFO rule (3.2% versus 3.9% for scenario R96 and 14.8% versus 18.4% for scenario R80; table 1). Over the 15 years of simulation, the yearly mortality rate remained high and relatively stable with the FIFO rule, whereas with the HELTx rule on-list mortality increased over time to reach levels only slightly lower than those of the FIFO rule: from 2.0% to 4.4% in the R96 scenario and from 8.7% to 21.0% in the R80 scenario (figure 3a).
The HELTx strategy was, however, associated with a rapid and major rise in waiting times until LT for the overall cohort of patients when compared with FIFO (from 0.9 months to 2.5 months for scenario R96 and from 5.8 months to 16.3 months for scenario R80; figure 3b). This increase mirrored the progressive accumulation of patients on the waiting list. Additional scenarios beyond R96 and R80 (with 60–105 available organs per 100 VPs) showed that this accumulation under HELTx occurs as soon as the number of available organs per 100 VPs falls below 95 (figure 3c). By comparison, list sizes and times until transplant were relatively stable under the FIFO strategy.
Consequences on long-term survival
As the HELTx strategy prioritises LT for the most critically ill patients, who may have poorer post-transplantation outcomes, the potential harmful consequences for the overall long-term survival of the whole population should be considered. Indeed, the overall proportion of LTs performed on patients from the high-priority list increased between the first and fifteenth years of simulation, from 4.0% to 9.1% for scenario R96 and from 11.6% to 32.5% for scenario R80 (figure 4a). Being enlisted on the high-priority list effectively increased the probability of receiving a transplant to approximately 70%, regardless of the scenario of organ scarcity. This must be compared with the probability of receiving a transplant for VPs in similar situations of imminent death under the FIFO rule: 28% for scenario R96 and 7.5% for scenario R80. As a consequence, LT was performed on VPs in poorer condition, which in turn led to slightly worse post-transplantation outcomes (figure 4b and c).
Regarding overall survival (i.e. time from enrolment on the waiting list to death, with or without LT), HELTx led to a significant improvement in the first few years following enrolment (1-year survival with HELTx compared with FIFO: 93.7% versus 92.4% in scenario R96 and 86.6% versus 80.4% in scenario R80; figure 5a and b). This difference was largest for VPs in diagnosis group D, followed by groups B and C (figure 5c–e). By contrast, VPs in diagnosis group A (who have no access to the high-priority list) had a moderately lower early survival under the HELTx operating rule. This early advantage of HELTx had no significant long-term consequence on survival and, from approximately 5 years after enrolment, survival rates were similar with both operating rules.
Sensitivity analysis
Sensitivity analyses were conducted to verify the robustness of the results. A wide range of parameter values defining survival, as well as organ/recipient ratios, were tested using Latin hypercube sampling. Essentially the same results were obtained in all situations, with more noticeable differences between allocation strategies as the organ/recipient ratio decreased. In particular, there was no situation in which early survival after listing was higher with the FIFO operating rule (supplementary appendix).
Discussion
After the first successful LTs were performed in the early 1980s, the number of procedures performed worldwide increased rapidly [1, 27]. Organ allocation policies had to account for a continuously growing pool of LT candidates, while the availability of transplantable donor lungs remained insufficient to meet demand. In some countries, such as the USA, grafts were allocated to patients according to the time spent on the waiting list, whereas in others, such as France, organs were allocated to centres in which physicians were in charge of allocating the graft to the most suitable patient, based on clinical judgement. One of the main downsides of these policies was that they could not accommodate the sickest patients, who faced high mortality on the waiting list. In the USA, for instance, mortality on the waiting list before implementation of the LAS system in 2005 ranged from 13.8% for COPD patients to 33% for patients with idiopathic pulmonary fibrosis [8]. Over the years, most countries have moved towards organ allocation policies based either on the expected survival benefit of LT or on medical urgency. In Europe, most countries have adopted a system in which grafts are allocated first to the sickest patients, then either to patients according to their waiting time or to centres, which in turn decide on the recipient. Depending on the country, high emergency status is granted either on clearly defined criteria or by expert decision, and the qualifying criteria vary from country to country. In France, for instance, only patients with a diagnosis of cystic fibrosis, pulmonary arterial hypertension or idiopathic pulmonary fibrosis (but not COPD) may qualify for this status, whereas COPD patients may be granted a high emergency status in the UK.
Anticipating the impact of the implementation of a given organ allocation policy is complex, as it may depend on parameters that can evolve rapidly over time (e.g. organ supply and demand). To complicate the issue further, a given organ allocation strategy may have different short- and long-term impacts. For instance, the early drop in waiting list mortality observed in the first years following implementation of a high emergency organ allocation rule in heart transplantation in France was followed by a dramatic increase in the proportion of patients receiving a transplant under a high emergency status (up to 50%), with decreased post-transplant survival [18].
In this study, we used simulation tools to assess the performance of an organ allocation policy based on HELTx. We designed a flexible framework in which VPs were created with realistic characteristics that evolved over time and were followed from their first enrolment on the waiting list for LT until their eventual death. Our simulations show that the impact of HELTx depends largely on the organ supply. When the organ supply is sufficient (organ/recipient ratio ≥95%), the implementation of a high emergency organ allocation strategy may prevent a small number of deaths without significant impact on waiting times or on overall long-term survival. When the organ/recipient ratio is lower, the benefit in terms of waiting list mortality seen in the first years following implementation of the HELTx policy vanishes over time, with dramatic waiting list inflation and increased waiting times [14]. Indeed, with the HELTx rule, the most critically ill patients are scheduled first for transplantation. These patients, who would probably have died on the waiting list, are allocated organs that would otherwise have gone to other patients. As a consequence, other patients accumulate on the waiting list because their survival allows for a longer wait. This accumulation becomes all the more noticeable when the organ supply falls below 95 organs per 100 patients. Furthermore, HELTx in the context of a low organ supply also leads to significant changes in the indications for LT, with a growing number of patients with cystic fibrosis, pulmonary arterial hypertension and idiopathic pulmonary fibrosis receiving a graft over time at the expense of COPD recipients. This imbalance is further aggravated in France by the decision not to grant a high emergency status to COPD patients.
Several studies have used numerical simulations to test the impact of different organ allocation strategies in the field of LT. Using such an approach, Munson et al. [20] concluded that the use of single LT in patients with COPD improved access to organs without a significant reduction in post-transplant survival in the USA. Similarly, the Scientific Registry of Transplant Recipients (SRTR) has developed simulated allocation models to replicate the allocation of organs and the outcomes of candidates both on the waiting list and after transplantation [28]. One of these models, the Thoracic Simulated Allocation Model (TSAM), has been specifically designed for the allocation of thoracic organs. Although this model was used when designing the LAS system [8] and more recently to assess the impact of broader geographic sharing of paediatric donor lungs [29], we are unaware of any scientific publication precisely describing the simulation framework and the validity of its results.
This study has several limitations. An accurate quantitative assessment of operating rules for LT waiting lists obviously depends on the models used to predict survival before and after transplantation. Here, we adopted the models used in the LAS scoring system, in which a baseline survival function is modified by individual characteristics to predict effective survival. We extrapolated the baseline survival function, which is described only for the first year in the original scoring system, to allow the simulation of long-term survival. To allow for uncertainty in this extrapolation, we used several parameter combinations in sensitivity analyses and found no evidence that our results depended on the specific set of chosen parameter values. It is well known that the prediction of post-transplant survival is difficult and that the LAS equations for post-transplant survival have poor discrimination [30]. Moreover, because the LAS equations were used to simulate survival in our framework, including the LAS-based allocation system in our assessment would have provided an unfair comparison with HELTx; independent survival equations would be necessary for such comparisons. Although this limits the scope of this work, our conclusions are of interest for countries considering, or having recently implemented, high emergency allocation rules, as they highlight the conditions that may lead to detrimental waiting list dynamics. Another limitation is the inability to anticipate modifications in physician behaviour in response to a given organ allocation policy. It is well known, for instance, that before LAS implementation in the USA, listing and transplantation practices changed over time to overcome the inability to access organs rapidly: many centres listed patients at a very early stage of their disease to accrue time on the waiting list, leading to a dramatic increase in the size of the waiting list. Such behaviours are difficult to anticipate with numerical simulations. The use of UNOS data (i.e. US patients) may limit the transposition of the results to other areas. Indeed, for want of detailed data on French or European patients, we assumed that the course of disease was similar in US patients and others. Furthermore, the use of UNOS data was limited to modelling the initial characteristics of patients and the changes in quantitative characteristics over time. Although the relative importance of patient categories may differ in European countries, the effect on the comparison of the two allocation rules should be limited, as the VPs were modelled identically in both groups. Finally, we did not consider some measures that could be implemented to cope with the shortcomings of the HELTx policy, such as limiting the total number of patients per centre allowed to receive a graft under this status. Other strategies could also be envisioned, such as denying access to LT to patients with low expected post-transplant survival.
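For context on the survival models discussed above: the LAS equations have a proportional-hazards form, S(t|x) = S0(t)^exp(β'x), so a survival time can be drawn by inverse-transform sampling once a baseline S0(t) is fixed. A minimal sketch, assuming (purely for illustration) an exponential baseline rather than the extrapolated LAS baseline used in the study:

```python
import math
import random

def draw_survival_time(linear_predictor: float,
                       baseline_hazard: float = 0.35) -> float:
    """Draw a survival time (years) from S(t | x) = S0(t) ** exp(b'x).

    Illustrative assumption: S0(t) = exp(-h0 * t), an exponential baseline;
    the study instead extrapolates the LAS baseline survival, which is
    published only for the first year. Inverse transform: solve
    S(t | x) = u for t, with u ~ Uniform(0, 1).
    """
    u = random.random()
    hazard = baseline_hazard * math.exp(linear_predictor)
    return -math.log(u) / hazard
```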
In conclusion, our simulations suggest that HELTx may be a reasonable organ allocation policy when the organ supply is high. However, the use of such an organ allocation policy has several detrimental impacts that may be revealed only several years after its implementation, especially when the organ/recipient ratio is lower than 95%. Further studies are needed to test whether some improvements could be made to mitigate the flaws in this organ allocation policy and whether alternative organ allocation policies (like the LAS system) have better properties.
Supplementary material
Please note: supplementary material is not edited by the Editorial Office, and is uploaded as it has been supplied by the author.
Acknowledgements
Authorship: PYB and GT were responsible for the design of this study. JR and PYB conceived the model, performed the simulations, and analysed the data. JR, PYB, JDC, and GT were involved in drafting and revising the manuscript.
Footnotes
This article has supplementary material available from openres.ersjournals.com
Conflict of interest: Disclosures can be found alongside this article at openres.ersjournals.com
Support statement: This work has been funded by a grant from Vaincre la mucoviscidose. Funding information for this article has been deposited with the Crossref Funder Registry.
- Received February 17, 2017.
- Accepted September 2, 2017.
- Copyright ©ERS 2017
This article is open access and distributed under the terms of the Creative Commons Attribution Non-Commercial Licence 4.0.