AUTHOR=Li Bryan M., Castorina Leonardo V., Valdés Hernández Maria del C., Clancy Una, Wiseman Stewart J., Sakka Eleni, Storkey Amos J., Jaime Garcia Daniela, Cheng Yajun, Doubal Fergus, Thrippleton Michael T., Stringer Michael, Wardlaw Joanna M. TITLE=Deep attention super-resolution of brain magnetic resonance images acquired under clinical protocols JOURNAL=Frontiers in Computational Neuroscience VOLUME=16 YEAR=2022 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2022.887633 DOI=10.3389/fncom.2022.887633 ISSN=1662-5188 ABSTRACT=

Vast quantities of magnetic resonance images (MRIs) are routinely acquired in clinical practice but, to speed up acquisition, these scans are typically of a quality that is sufficient for clinical diagnosis yet sub-optimal for large-scale precision medicine, computational diagnostics, and collaborative neuroimaging research. Here, we present a critic-guided framework to upsample low-resolution (often 2D) full MRI scans to help overcome these limitations. We incorporate feature-importance and self-attention methods into our model to improve the interpretability of this study. We evaluate our framework on paired low- and high-resolution brain MRI structural full scans (i.e., T1-, T2-weighted, and FLAIR sequences are simultaneously input) obtained in clinical and research settings from scanners manufactured by Siemens, Philips, and GE. We show that the upsampled MRIs are qualitatively faithful to the ground-truth high-quality scans (PSNR = 35.39; MAE = 3.78E−3; NMSE = 4.32E−10; SSIM = 0.9852; mean normal-appearing gray/white matter ratio intensity differences ranging from 0.0363 to 0.0784 for FLAIR, from 0.0010 to 0.0138 for T1-weighted, and from 0.0156 to 0.074 for T2-weighted sequences). Automatic raw segmentation of tissues and lesions from the super-resolved images yields fewer false positives and higher accuracy than segmentation of interpolated images for protocols represented by more than three scan sets in the training sample, making our approach a strong candidate for practical application in clinical and collaborative research.
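
As a point of reference for the fidelity figures quoted above, the sketch below shows one common way such image-similarity metrics (PSNR, MAE, NMSE, SSIM) can be computed between a super-resolved volume and its high-resolution ground truth. This is an illustrative example only, not the authors' implementation: the function name, the scikit-image calls, the [0, 1] intensity normalisation, and the particular NMSE definition (error energy divided by ground-truth energy) are all assumptions made for the sketch.

```python
# Illustrative sketch (not the authors' code): image-fidelity metrics between a
# super-resolved MRI volume and its co-registered high-resolution ground truth.
# Assumes both volumes are NumPy arrays of identical shape, normalised to [0, 1].
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity


def fidelity_metrics(sr: np.ndarray, hr: np.ndarray) -> dict:
    """Return PSNR, MAE, NMSE, and SSIM for super-resolved (sr) vs. ground-truth (hr)."""
    mae = float(np.mean(np.abs(sr - hr)))                  # mean absolute error
    nmse = float(np.sum((sr - hr) ** 2) / np.sum(hr ** 2)) # one common NMSE definition
    psnr = peak_signal_noise_ratio(hr, sr, data_range=1.0)
    ssim = structural_similarity(hr, sr, data_range=1.0)   # windowed SSIM over the 3D volume
    return {"PSNR": psnr, "MAE": mae, "NMSE": nmse, "SSIM": ssim}


# Example with random stand-in volumes (replace with real co-registered scans):
hr = np.random.rand(64, 64, 64).astype(np.float32)
sr = np.clip(hr + 0.01 * np.random.randn(*hr.shape).astype(np.float32), 0.0, 1.0)
print(fidelity_metrics(sr, hr))
```

In practice the two volumes would need to be skull-stripped, co-registered, and intensity-normalised consistently before such metrics are meaningful; the exact preprocessing used in the paper is described in its methods, not in this sketch.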