Summary
Organ transplantation has developed over the past 50 years into the sophisticated, integrated clinical service of today through several advances in science. One of the most important of these has been the ability to apply organ preservation protocols that deliver high-quality donor organs, via a network of organ exchange, matching the most suitable recipient to the best available organ capable of rapid resumption of life-sustaining function. This has only been possible by amassing a good understanding of the potential effects of hypoxic injury on donated organs, and of how to prevent these by applying organ preservation. This review sets out the history of organ preservation, how applications of hypothermia have become central to the process, and the current status of preservation for the range of solid organs commonly transplanted. The science of organ preservation is constantly being updated with new knowledge and ideas, and the review also discusses which innovations are coming close to clinical reality to meet the growing demand for high-quality organs in transplantation over the next few years.
Introduction
Organ transplantation has been one of the most significant advances in medicine in the latter half of the 20th century and remains, in many cases, the only effective therapy for end-stage organ failure. The supply of high-quality, efficacious organs has always been a step of extreme importance in the overall multi-disciplinary approach to transplantation, and was identified as such early in service development. Organ preservation has been described as ‘the supply line for organ transplantation’ [1], and in a logistical sense ‘preservation’ buys ‘time’, which is essential to organise staff and facilities, transport organs, and perform necessary laboratory tests. The ability to use the powerful effects of cooling to slow biological deterioration in organs removed from their normal physiological environment has permitted the development of modern transplant services. However, low temperatures have a multitude of effects (both supportive and destructive) on mammalian cell biology, which have required careful investigation before cold could be applied during organ preservation [2]; this has been the history of the development of the techniques currently in use world-wide. In this review, we set out what we have learnt during the past 40 years since organ preservation was first applied in the clinic, the basis of current practices, and the challenges and opportunities for the future as demand for organ transplantation increases.
History of Organ Preservation for Transplantation
By the start of clinical transplantation in the 1960s, there was already a significant knowledge base from physiologists and anatomists about requirements to keep organs viable and functioning outside the body. The development of reliable perfusion pumps allowed the use of artificial perfusion circuits to study the physiology of organ function [3]. These kinds of studies were further facilitated by replacement of blood by synthetic perfusates – solutions of electrolytes, solutes, and vitamins [4, 5]. At the same time, use of lower temperatures to alter metabolism during perfusion provided an additional powerful tool [6]. Specific information on use of low temperatures to avoid organ damage during surgery was also gradually being amassed [7]. The team led by Calne performed one of the earliest investigations into the relative merits of cooling kidneys by simple surface cooling or via the whole vascular bed by perfusion of the renal artery with cooled heparinised blood [8]. These authors concluded that vascular flush perfusion was the more efficient method of cooling the organ mass. However, use of chilled heparinised or diluted blood still caused many problems such as vascular stasis on re-implantation of the graft. Therefore, it was recognised that successful preservation would require the development of new, acellular solutions, which ushered in the first era of organ preservation research.
The pioneering work by Collins and his colleagues [9] led to the design of an acellular solution which mimicked, in a simple fashion, the intracellular electrolyte balance of mammalian cells, and was the first notable attempt to advance organ preservation based on an understanding of the ongoing changes in cells during cooling. The resultant solutions (termed Collins solutions) provided a significant improvement in reliable preservation and were used routinely in transplant centres across the world for almost two decades. These solutions permitted successful renal preservation for 24–36 h, which was long enough to allow tissue matching and sharing of organs between transplant centres. Alongside these approaches, the proponents of continuous hypothermic perfusion continued to develop their methods for oxygenated, low-temperature, low-pressure perfusion [10, 11]. However, due to logistical considerations and the reliability of the perfusion equipment available at that time, static flush cooling followed by ice storage became the most widely used preservation method. With the expansion of transplantation as a clinical therapy from the 1980s onwards, methods of multiple-organ procurement and preservation became necessary. In this approach, kidneys, liver, pancreas, heart, and lungs, or various combinations of these organs, could be removed without jeopardizing any of the individual organs. This led to the adoption of so-called ‘flexible techniques’ for organ procurement [12], in which all donor organs to be procured are cooled in situ, rapidly and simultaneously, removed in a bloodless field, and further dissected and packaged on a back table.
Over recent years, the continuing shortage of donor organs for transplantation and changes in donor demographics have maintained the pressure to achieve an optimal and efficacious approach to organ preservation, and this remains the driver for ongoing research in the field.
Use of Cooling and What Cooling Does
The stages of transplantation, from retrieval of the organ from the donor, through preservation (usually by cooling in a specific solution), to implantation in the recipient (re-perfusion), are accompanied by a myriad of complex, multi-factorial changes: molecular, cellular, and tissue alterations produced during the obligatory anoxic period of surgery and the duration of cold storage, which become manifest in the transplant. The pathophysiological processes responsible for these transplant injuries are defined as ischaemia/re-perfusion injury (IRI). Chilling of organs remains one of the most widespread methods of preserving organs for transplant, even though this fundamental requirement has harmful repercussions on the tissues through oxidative stress (production of reactive oxygen species) and inflammation (cytokine production), which are probably responsible for the exacerbation and, above all, the persistence of this injury. Significant structural changes in the cytoskeleton result in dislocation of the endothelial cells. However, one thing is clear: good organ preservation is a major determinant of graft outcome after re-vascularisation. While the use of preservation solutions has significantly improved organ preservation, it merely slows extracorporeal ischaemic and hypoxic damage rather than preventing cellular injury. This results in primary graft dysfunction, which remains a major clinical problem.
Cooling is the first line of defense against hypoxic injury: it reduces cellular metabolism and thus the requirement for oxygen, thereby preventing tissue injury. A great deal of information has been accumulated over the past centuries on the effects of cold on physiology and biochemistry [2]. In evolutionary terms, some animal systems (such as hibernating mammals) have become adapted to cold exposure, which permits exploitation of environmental niches in cold parts of the planet. In nature, several key criteria have evolved that enable cold, oxygen-limited survival; among these, the major ones that can be linked with organ preservation for transplantation are: i) global metabolic rate suppression; ii) metabolic pathways capable of sustaining and/or delivering minimal essential energy supplies (ATP, with phenotypic alterations to prioritise ATP use); and iii) enhancement of defense mechanisms (at the molecular and cellular levels) to allow a cohesive return to normal metabolism during arousal. These basic evolutionary links across animal physiology may be one reason why human organs can tolerate cooling to some extent. However, human organs do not have the capacity to replicate all of the genotypic and phenotypic changes made by cold-tolerant animals, which is why cooling has limited success, with only short exposures being tolerated. In metabolically active cells, ATP levels are normally held constant; even at 4 °C, however, some residual metabolic activity remains. At the cellular level, several metabolic pathways are affected: inhibition of the Na+/K+ ATPase causes cell oedema, rapid depletion of ATP reserves, and a corresponding increase in ADP levels, and this depletion of ATP leads to the degradation of adenosine and the accumulation of hypoxanthine and xanthine oxidase. Cell membrane depolarisation also occurs very early in the cascade, leading to a breakdown of ion homeostasis and an interplay of other intracellular and membrane-associated events that eventually culminate in cell death by either apoptosis or necrosis [13].
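As a rough quantitative illustration of why cooling suppresses metabolism so effectively, the van ’t Hoff Q10 rule is often used to estimate the residual metabolic rate at storage temperature; the short Python sketch below applies it, assuming a commonly quoted Q10 of 2 (an assumption for this example, not a figure taken from this review).

# Illustrative sketch: van 't Hoff Q10 estimate of metabolic suppression by cooling.
# Q10 = 2.0 is an assumed, commonly quoted value; real tissues vary.

def relative_metabolic_rate(t_cold_c: float, t_body_c: float = 37.0, q10: float = 2.0) -> float:
    """Fraction of normothermic metabolic rate remaining at t_cold_c."""
    return q10 ** ((t_cold_c - t_body_c) / 10.0)

if __name__ == "__main__":
    for t in (37.0, 20.0, 10.0, 4.0, 0.0):
        print(f"{t:5.1f} degC -> {relative_metabolic_rate(t):6.1%} of normothermic metabolism")

Under these assumptions, cooling to 4 °C leaves roughly 10% of normothermic metabolism, consistent with the point above that cold storage slows metabolism markedly but does not abolish it.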
There are also problems associated with calcium movements and an increase in anaerobic glycolysis, which is mainly responsible for intracellular acidosis. The rise in cytosolic Ca2+ concentration is thought to result from inhibition of the endoplasmic reticulum Ca2+ ATPase by ATP deficiency, inhibition of the Na+/Ca2+ antiporter by increased cytosolic sodium concentrations (and thus decreased sodium gradients), and operation of the Na+/Ca2+ antiporter in ‘reverse mode’, transporting calcium in and sodium out. This triggers mitochondrial dysfunction by disrupting membrane permeability, allowing the accumulation of calcium, sodium, and water within the cell. The subsequent membrane injury and protein degradation have long been considered the decisive steps towards lethal cell injury [14]. The extent of overall preservation injury is directly related to the severity of IRI after transplantation [15]. For example, in the liver, among hepatocellular perturbations, dysfunction in liver cell volume regulation, bile secretion, drug metabolism, and mitochondrial function have been reported as a result of cold preservation/warm re-oxygenation [16]. In contrast, liver endothelial cells did not show the classic mitochondrial permeability transition associated with mitochondrial swelling, but rather mitochondrial ultra-condensation, another mitochondrial alteration shown to be involved in diverse apoptotic pathways. Mitochondrial fragmentation, however, was fully reversible and appears to be largely a stress phenomenon that is not critically involved in cold-induced injury [14].
The length of time for which an organ such as the kidney can be stored depends both on cooling, to diminish metabolic activity and thus oxygen requirements, and on the use of fluids designed to preserve the intracellular milieu in the absence of a functioning Na+/K+ pump [17]. The exact mechanisms underlying these functional disorders are still not understood, but the involvement of intracellular Ca2+ has been proposed [18]. Accumulating evidence suggests that perturbations at the endoplasmic reticulum act as novel sub-cellular effectors possibly involved in the promotion of cell death in various pathologies, including the pathophysiology of organ preservation [19].
Oxygen radicals appear to be involved in the microvascular and parenchymal cell injury seen in various pathological disorders associated with organ preservation. Studies indicate that oxygen radicals increase microvascular permeability by creating large leakage sites, predominantly in the small venules. The highly reactive hydroxyl radical appears to be responsible for the microvascular alterations associated with oxygen radical production, and there is considerable indirect evidence that oxygen radicals are involved in the pathogenesis of circulatory shock [20]. Unlike many other types of cell injury mediated by reactive oxygen species, cold-induced cell injury (whether occurring in the cold itself or during re-warming) does not appear to be triggered by an enhanced release of the reactive oxygen species superoxide anion radical (O2•−) or hydrogen peroxide (H2O2), but by an increase in the cellular chelatable, redox-active iron pool, which converts these species of low reactivity (and molecular oxygen) into highly reactive species such as the hydroxyl radical or iron-oxygen species [14]. Depending on the residual ATP level, which is related to the duration of ischaemia, this dysfunction will manifest as apoptosis or necrosis; the ATP concentration therefore acts as a ‘switch’ between these two types of cell death. When the mitochondrial membrane permeability transition is accompanied by ATP depletion (prolonged ischaemia), the apoptotic signal is blocked and necrosis develops.
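The ‘switch’ described above can be caricatured as a simple threshold rule; the sketch below is purely illustrative (the 30% threshold and the function name are assumptions made for this example, not values from the literature), reflecting the fact that apoptosis is an active, ATP-requiring programme.

# Toy illustration of the ATP 'switch' between apoptosis and necrosis.
# The 30% threshold is an arbitrary assumption for illustration only:
# when ATP is severely depleted, the energy-dependent apoptotic
# programme cannot run and the cell defaults to necrosis.

def predicted_death_mode(atp_fraction: float, threshold: float = 0.3) -> str:
    """atp_fraction: residual ATP as a fraction of the normal level (0..1)."""
    if not 0.0 <= atp_fraction <= 1.0:
        raise ValueError("atp_fraction must lie between 0 and 1")
    return "apoptosis" if atp_fraction >= threshold else "necrosis"

print(predicted_death_mode(0.6))   # short ischaemia, ATP partly preserved -> apoptosis
print(predicted_death_mode(0.1))   # prolonged ischaemia, ATP depleted -> necrosis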
The few molecular studies of gene expression during the ischaemia/re-perfusion period carried out to date have focused on the re-perfusion phase. They have highlighted the role of certain signalling pathways, such as the pro- and anti-apoptotic pathways involving Bcl-2 and Bag-1, two regulatory genes controlling apoptosis [17], in reducing hypoxic cell damage. There is also a possible effect on the haeme oxygenase pathway (a member of the stress protein family) [21] and on the transcription factor HIF (hypoxia-inducible factor) pathway [22]. Many hurdles remain before such evidence can be translated into improved preservation methods in clinical application, for example in gene therapy, including the nature of the transfection agent and the timing of gene induction, but these will undoubtedly become the centre of attention in the near future.
Using proteomic technology, recently published reports such as that of Strey et al. [23] have identified several proteins implicated in cellular adaptation mechanisms in response to IRI, and have probed whether these molecular clues might explain the phenotypic changes expressed in the cells. Of course, further studies are needed to elucidate the relevance of the observed changes in the clinical setting.
Maintaining the viability of the transplant during its ischaemic transfer from donor to recipient is based mainly on hypothermia, which is deliberately applied to reduce metabolic activity. Tolerated periods of cold ischaemia vary depending on the organ: 24 h for the kidney, 12–15 h for the liver, a maximum of 8 h for the lung, and 6 h for the heart. Prolonged cold ischaemia is an independent risk factor for non-function or dysfunction of the transplant.
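These organ-specific tolerances lend themselves to a simple lookup check; the sketch below merely encodes the limits quoted in the paragraph above (the dictionary, function name, and check are illustrative constructs for this example, not part of any clinical system).

# Maximum tolerated cold ischaemia times (hours) as quoted in the text above.
# Illustrative sketch only, not clinical decision software.

COLD_ISCHAEMIA_LIMIT_H = {
    "kidney": 24,
    "liver": 15,   # upper end of the 12-15 h range quoted above
    "lung": 8,
    "heart": 6,
}

def within_tolerated_limit(organ: str, elapsed_h: float) -> bool:
    """Return True if the elapsed cold ischaemia time is within the quoted limit."""
    try:
        return elapsed_h <= COLD_ISCHAEMIA_LIMIT_H[organ]
    except KeyError:
        raise ValueError(f"no quoted limit for organ: {organ!r}")

print(within_tolerated_limit("kidney", 20.0))  # True: within the 24 h limit
print(within_tolerated_limit("heart", 7.5))    # False: exceeds the 6 h limit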