Advances in hematopoietic stem cell transplant (HSCT) have led to changes in the approach to donor selection. Many of these new approaches result in greater HLA loci mismatching, either through the selection of haploidentical donors or of permissive HLA mismatches. Although these approaches expand transplant access for many patients by increasing the number of acceptable donor HLA genotypes, they add the potential barrier of donor-specific HLA antibodies (DSA). DSA present a unique challenge in HSCT, as they can limit engraftment and lead to graft failure. However, transient reduction of HLA antibodies through desensitization treatments can limit the risk of graft failure and facilitate engraftment. Thus, the consideration of DSA in donor selection and the management of DSA prior to transplant are playing an increasingly important role in HSCT. In this review, we discuss studies addressing the role of HLA antibodies in HSCT, the reported impact of desensitization on DSA levels, and the implications for selecting donors for patients with DSA. We found a clear consensus that moderate-strength DSA should be avoided, while desensitization strategies are reported to be effective in most cases at reducing DSA to acceptable levels. There is limited information regarding the impact of specific characteristics of DSA, such as the HLA loci involved or the overall level of sensitization, which could further aid in donor selection for sensitized HSCT candidates.
Class I Human Leukocyte Antigen (HLA) evolutionary divergence (HED) is a metric that reflects immunopeptidome diversity and has been associated with immune checkpoint inhibitor responses in solid tumors. Its impact on and relevance to allogeneic hematopoietic stem cell transplantation (HCT) have not yet been thoroughly studied. This study analyzed the clinical and immune impact of class I and class II HED in 492 acute myeloid leukemia (AML) recipients undergoing HCT. The overall cohort was divided into a training set (n=338) and a testing set (n=132). Univariate Cox screening found a positive impact of high class I HED and a negative impact of high class II HED on both disease-free survival (DFS) and overall survival (OS). These results were combined into a single marker, the class I/class II HED ratio, which was assessed in the testing cohort. The final multivariate Cox model confirmed the positive impact of a high versus low class I/class II HED ratio on both DFS (Hazard Ratio (HR) 0.41 [95% CI 0.2-0.83]; p=0.01) and OS (HR 0.34 [0.19-0.59]; p<0.001), independently of HLA matching and other HCT parameters. No significant association was found between the ratio and graft-versus-host disease (GvHD) or neutrophil and platelet recovery. High class I HED was associated with a tendency toward increased NK, CD8 T-cell, and B-cell recovery at 12 months. These results introduce HED as an original and independent prognostic marker reflecting immunopeptidome diversity and alloreactivity after HCT.
Frontiers in Immunology
PTCY and Allo-HCT: A Deep Dive into Outcomes, Toxicities, and Patient-Centered Care