Exploring the brain-behavior relationship via MRI, EEG, fNIRS, and MEG has become a research hotspot, further accelerated by the emergence of large-sample open-access datasets such as the UK Biobank, the Human Connectome Project, the Adolescent Brain Cognitive Development (ABCD) Study, the National Institute of Mental Health (NIMH) Intramural Healthy Volunteer Dataset, the TUH EEG Corpus, and many other multimodal datasets. Many prior studies have performed prediction tasks in diverse populations (from infants to adults; from healthy subjects to patients) with a variety of imaging modalities. However, constructing precise, generalizable, and reproducible brain-behavior relationships still faces many challenges, including individual variability, multi-site heterogeneity, limited interpretability of imaging results, poor model generalization, low prediction performance, and a lack of clinical applications. This Research Topic aims to promote the understanding of the brain-behavior relationship using multimodal imaging and to facilitate clinical applications of brain-behavior models. Submissions addressing any of the above questions from various perspectives are encouraged.
Topics of interest include but are not limited to:
- Use of multiple modalities, including MRI, EEG, fNIRS, and MEG (either through novel methods or application of recently developed methods), to explore brain-behavior relationships in humans and animals, such as predicting individual traits (e.g., cognitive ability, motor function), disease classification, disease severity, and disease prognosis.
- How to identify and mitigate possible confounding factors in constructing brain-behavior models, such as individual variability and population heterogeneity.
- How to precisely model the brain-behavior relationship using novel multimodal methods, including machine learning and deep learning approaches that have not yet been extensively applied.
- Approaches to improve the interpretability of brain-behavior models, such as elucidating putative neural mechanisms underlying specific behaviors via multimodal imaging.
- Multimodal methods to detect reliable brain-behavior relationships within relatively small datasets.
- Techniques to fuse multimodal imaging or cross-modal information to improve brain-behavior prediction.
- Multimodal mapping of the brain’s structural connectivity using tissue microstructure, to enable quantitative analysis of brain-behavior relationships in healthy subjects and patients.
- Evaluation of the brain-behavior relationship across the lifespan via multimodal imaging methods that capture, e.g., both grey matter and white matter.