The first successful cord blood transplantation (CBT) was performed by Dr. Gluckman, Dr. Broxmeyer, and their colleagues for a patient with Fanconi anemia in 1989. Since then, the application of CBT has broadened to include both malignant and nonmalignant diseases, and cord blood has emerged as an essential alternative donor source in the practice of hematopoietic cell transplantation (HCT). CBT has become especially important because of the lack of a matched sibling donor for the majority of patients needing an allogeneic HCT, and the underrepresentation of many ethnicities in large donor registries such as the National Marrow Donor Program (NMDP) and its cooperative international registries.
CBT offers many advantages: cord blood is a rich source of hematopoietic stem and progenitor cells with potent self-renewal, proliferation, homing, and hematopoietic reconstitution capacity. Moreover, cord blood is easy to collect, with essentially no risk to the mother or the newborn. Because it is cryopreserved, it is available in a timely manner. In fact, it has been shown that cord blood units mismatched at one or two human leukocyte antigen (HLA) loci can be readily provided for almost all patients younger than 20 years of age and for 80% or more of older patients. This allows patients to proceed with HCT rapidly, usually within 2 to 4 weeks, compared with the slower time frames typical of unrelated donor sources. Other advantages of CBT include the tolerability of HLA mismatch in recipients and the low associated risk of graft-versus-host disease (GVHD), while still possessing potent graft-versus-malignancy effects. On the other hand, historical disadvantages of CBT include an increased risk of graft rejection and a higher risk of infection, which have been associated with increased nonrelapse mortality (NRM) in some series. However, significant advances have been made in the techniques and practices of HCT in general and CBT in particular, which have substantially improved the outcomes of patients undergoing CBT to levels comparable to those following related and unrelated bone marrow and peripheral blood transplants. Indeed, the performance of HCT according to donor type was investigated in a large study based on the European Society for Blood and Marrow Transplantation (EBMT) registry, which included 106,188 HCT recipients, among them over 3000 CBT recipients, between 2001 and 2015, and showed a significant improvement in HCT outcomes in the period from 2010 to 2015 compared with 2006 to 2010.
The comparison between the outcomes of various types of HCT has been reported by many studies in various disease settings. A large retrospective single-institution study compared CBT, matched unrelated donor (MUD) transplant, and mismatched unrelated donor (MMUD) transplant in patients with acute leukemia and myelodysplastic syndrome (MDS) with pretransplantation minimal residual disease undergoing a myeloablative HCT. The study demonstrated that CBT was associated with an overall survival (OS) rate as favorable as that of MUD transplant and significantly higher than that of MMUD transplant. Furthermore, CBT was associated with a lower relapse probability than either MUD or MMUD HCT. Another prospective study in adult patients with leukemia and MDS reported the outcomes of MUD transplant in 91 recipients and CBT in 119 recipients. In this study, MUD and CBT were associated with similar OS, NRM, and relapse rates, supporting the viability of cord blood as an integral alternative graft source. A recent landmark study from the Acute Leukemia Working Party (ALWP) of the EBMT, reporting on the long-term outcomes of patients with acute myeloid leukemia (AML) who were alive and free of disease 2 years after allogeneic HCT with either cord blood or unrelated donor grafts, highlighted that the donor source did not impact long-term outcomes in AML. The study included 364 CBT recipients, 2648 recipients of 10/10 HLA-matched MUD grafts, and 681 recipients of 9/10 HLA-matched MUD grafts. With a median follow-up of 6 years, the 5-year leukemia-free survival (LFS) from HCT was 86% in CBT recipients, 84% in 10/10 MUD recipients, and 84% in 9/10 MUD recipients. On multivariate analysis, graft source did not affect LFS, and the incidences of relapse and NRM did not differ between graft sources. The study showed that while patient and disease characteristics did impact LFS, the donor type itself did not.
However, in another study comparing CBT to MMUD transplant (9/10 HLA matching) with posttransplant cyclophosphamide in patients with AML, CBT was associated with lower LFS, OS, and GVHD-free, relapse-free survival, owing to a higher NRM. Recently, the Blood and Marrow Transplant Clinical Trials Network reported the results of a randomized controlled trial of CBT versus haploidentical HCT. While progression-free survival (PFS), the primary endpoint of the study, was similar between the two groups, the authors noted that, at this point, haploidentical transplant is favored over CBT in light of findings from secondary endpoints, including OS.
The immune properties of umbilical cord blood endow it with particular immune reconstitution kinetics that may yield better outcomes following CBT. A retrospective analysis of immune reconstitution following HCT from bone marrow, peripheral blood, or umbilical cord blood showed that differentiated natural killer (NK) cells and mature B cells are significantly increased after CBT. Furthermore, in a multivariate analysis, a higher CD16+ CD57- NK-cell count correlated with a lower incidence of relapse, whereas higher CD20+ B-cell and CD8+ CD11b- T-cell counts were associated with lower NRM, suggesting that the specific immune reconstitution events that follow CBT may contribute to its favorable outcomes. Similarly, a study of immune recovery in 106 adults undergoing CBT demonstrated a robust recovery of CD4+ T cells, which was associated with a reduced risk of mortality.
Recently, both the American Society for Transplantation and Cellular Therapy (ASTCT) and the NMDP/Center for International Blood and Marrow Transplant Research (CIBMTR) have published guidelines for optimal umbilical cord blood unit selection for CBT. Publication of these guidelines may help facilitate more widespread use of CBT.
The cell dose has a significant impact on the outcomes of CBT. While the total nucleated cell (TNC) dose is a well-established parameter for successful engraftment and survival following CBT, the CD34+ cell dose is the most reliable parameter for predicting engraftment. Therefore, both the TNC and the CD34+ cell doses should be considered together for optimal unit selection. The randomized phase III trial BMT CTN 0501 defined an adequate single-unit cryopreserved TNC dose as > 2.5 × 10^7/kg. Another large registry study of over 1500 myeloablative single-unit CBTs suggested a higher TNC dose of > 3.0 × 10^7/kg, as it demonstrated increased transplant-related mortality (TRM) with lower cord blood unit TNC doses. The National Cord Blood Program of the New York Blood Center, based on an analysis of over 1000 CBT recipients from a single bank, suggests using a higher TNC dose to offset the effect of increasing HLA mismatch. Of note, the ASTCT recommends an even higher TNC dose, in the range of > 4.0 to 5.0 × 10^7/kg, for nonmalignant diseases. The general consensus in malignant disease is a recommended TNC dose of greater than 3.0 × 10^7/kg.
The CD34+ cell dose is also critically important in unit selection. Existing U.S. and Eurocord guidelines agree on a minimum threshold CD34+ cell dose of 1.5 × 10^5/kg for single-unit grafts. In double CBT (dCBT), Purtill et al. demonstrated the critical role of the infused viable CD34+ cell dose of the dominant unit in determining the speed and success of neutrophil engraftment, and showed that a dominant unit infused viable CD34+ cell dose < 0.5 × 10^5/kg was associated with significant impairment in neutrophil engraftment.
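Because the thresholds above are all expressed per kilogram of recipient weight, checking a candidate unit reduces to simple arithmetic. The following sketch illustrates that calculation; the function name and defaults are hypothetical, the threshold values are those quoted above, and this is an illustration of the arithmetic, not clinical guidance.

```python
# Illustrative helper (hypothetical, not part of any guideline): convert a
# unit's total cell counts into per-kg doses and compare against the
# thresholds discussed in the text.

def meets_dose_thresholds(tnc_total, cd34_total, recipient_kg,
                          tnc_min_per_kg=3.0e7,    # consensus TNC dose, malignant disease
                          cd34_min_per_kg=1.5e5):  # U.S./Eurocord single-unit CD34+ minimum
    """Return per-kg doses and whether both thresholds are met."""
    tnc_per_kg = tnc_total / recipient_kg
    cd34_per_kg = cd34_total / recipient_kg
    return {
        "tnc_per_kg": tnc_per_kg,
        "cd34_per_kg": cd34_per_kg,
        "adequate": tnc_per_kg >= tnc_min_per_kg and cd34_per_kg >= cd34_min_per_kg,
    }

# Example: a 70 kg recipient and a unit with 2.5e9 TNC and 1.4e7 CD34+ cells
# (3.57e7 TNC/kg and 2.0e5 CD34+/kg, so both thresholds are met).
result = meets_dose_thresholds(2.5e9, 1.4e7, 70.0)
```

The same unit would fall short for a 100 kg recipient (2.5 × 10^7 TNC/kg), which is why larger patients often require units with higher absolute cell counts or a double-unit graft.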
The relatively naïve immune cells and the consequently lower frequency of alloreactive T cells in cord blood lead to a lower incidence of GVHD with CBT. Therefore, less stringent HLA matching between the cord blood unit and the recipient is required as compared with other graft sources. In a study of single CBT, Barker et al. showed that recipients of 6/6 matched cord blood units had the lowest TRM regardless of cell dose, followed by 5/6 matched units with a TNC dose > 2.5 × 10^7/kg or 4/6 matched units with a TNC dose > 5.0 × 10^7/kg, and then 5/6 matched units with a lower TNC dose (< 2.5 × 10^7/kg). These findings reinforce the importance of using both the TNC dose and the HLA-matching level in cord blood unit selection.
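The tiers reported by Barker et al. can be read as a simple ranking rule that trades HLA match against TNC dose. A minimal sketch of that rule follows; the function name and numeric tier labels are hypothetical conveniences for illustration, and the cutoffs are the ones quoted above.

```python
# Hypothetical encoding of the single-unit tiers from Barker et al.
# (lower tier = lower observed TRM in that study). Illustrative only.

def barker_tier(hla_match, tnc_per_kg):
    """hla_match: matched HLA loci out of 6; tnc_per_kg: cryopreserved TNC per kg."""
    if hla_match == 6:
        return 1                 # 6/6 match: lowest TRM regardless of dose
    if hla_match == 5 and tnc_per_kg > 2.5e7:
        return 2                 # 5/6 match with adequate dose
    if hla_match == 4 and tnc_per_kg > 5.0e7:
        return 2                 # 4/6 match offset by a higher TNC dose
    if hla_match == 5:
        return 3                 # 5/6 match with low dose (< 2.5e7/kg)
    return 4                     # outside the tiers reported in the study
```

Note how a 4/6 unit with a high dose ranks alongside a well-dosed 5/6 unit, which is the sense in which a higher TNC dose offsets an additional HLA mismatch.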
A study by the CIBMTR and Eurocord reported better outcomes in single CBT after myeloablative conditioning (MAC) with higher-resolution allele-level matching at four HLA loci (-A, -B, -C, and -DRB1), that is, the 8-allele HLA match grade. The study showed a lower frequency of neutrophil recovery for recipients of units mismatched at 3 to 5 alleles, but not at 1 to 2 alleles, compared with recipients of HLA-matched units. Moreover, NRM was higher with units mismatched at 1 to 5 alleles than with matched units. This retrospective study, together with findings from a study conducted at our institution demonstrating increased TRM with increasing HLA mismatch, confirms the importance of stricter HLA requirements in CBT. Despite prior studies reporting that a high degree of HLA mismatch did not adversely affect transplant outcomes, the U.S. and European guidelines recommend using a minimum of an 8-allele HLA match grade for cord blood unit selection.
Unit quality is an essential consideration in cord blood selection and is determined by cord blood bank practices, as reflected in cryopreservation and processing procedures. Hence, standardization and accreditation of cord blood banks by the Foundation for the Accreditation of Cellular Therapy (FACT) are important. Licensure by the U.S. Food and Drug Administration is also important, as its regulatory oversight ensures safety and reliability.
Other quality control measures include the cryopreservation volume, the red blood cell content, the year of cord blood collection, and the post-thaw CD34+ cell viability, among others. Red blood cell-replete units are no longer recommended because of their association with infusion reactions. Moreover, many centers prefer more recently collected cord blood units because of the optimization of processing and cryopreservation techniques over the past decade. Finally, post-thaw CD34+ cell viability is a very important consideration. A number of studies have shown the importance of the post-thaw colony-forming unit (CFU) dose for successful engraftment, and cord blood units with a low proportion of post-thaw viable CD34+ cells have been associated with very poor engraftment potential. The NetCord-FACT standards require a minimum thawed CD34+ cell viability of 70%, and the ASTCT recommends an even higher percentage where possible.
While the aforementioned criteria are critical for selecting individual cord blood units to ensure successful engraftment and improve outcomes, integrating them into unified scores is important, especially as meeting all ideal criteria may not always be practical. Kondo et al. developed a cord blood index (CBI) scoring system that accounts for the TNC dose, the CD34+ cell dose, the number of HLA mismatches at the antigen and allele levels, and the granulocyte/macrophage CFU count. Validated in large cohorts of CBT recipients, the CBI reliably predicted neutrophil and platelet engraftment as well as early NRM in adult patients, and may be a useful tool to guide transplanters in choosing the best cord blood unit.