In vivo brain imaging techniques are increasingly "quantitative", in the sense that the measurement is expressed on a ratio scale and is intended to be as objective as possible. Given the growing number of results based on quantitative imaging, and the need to move the scientific community toward a science that is reproducible by others and open to complementary measurements, works reporting advances in the understanding of the biological, biochemical, biomechanical, structural and functional phenomena of the brain must offer ever stronger guarantees of the reliability of the data extracted from quantitative imaging. The means to do so, whether computational (increasingly involving deep learning) or based on complementary investigative techniques, have expanded considerably, and the present Research Topic intends to showcase them.
The works published in this Research Topic will present important results in neuroscience obtained with quantitative in vivo imaging (quantitative MRI techniques, in vivo MR spectroscopy, biomedical optical imaging, quantitative PET, etc.) in which the measurements have been validated or critically assessed for their degree of reliability. This notion of reliability may include:
1) aspects related to metrology: repeatability and/or reproducibility of results, including in the workflow the processing methods that generate the quantitative values (see the sketch after this list);
2) validation against complementary measurements, from other in vivo imaging methods, or from ex vivo and immunohistochemistry techniques.
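As an illustration of the metrological aspects in point 1, the minimal sketch below shows how test-retest repeatability indices (within-subject standard deviation, repeatability coefficient and within-subject coefficient of variation) could be computed from two repeated quantitative measurements per subject; the data and variable names are purely hypothetical, and submissions are of course free to use any other established metrological framework.

```python
# Minimal sketch (hypothetical data): test-retest repeatability indices for a
# quantitative imaging measurement, following the classical within-subject
# standard-deviation approach.
import numpy as np

# Two repeated measurements per subject (e.g., a quantitative MRI parameter
# averaged over a region of interest); the values below are synthetic.
scan1 = np.array([1.02, 0.97, 1.10, 1.05, 0.99, 1.08])
scan2 = np.array([1.00, 0.95, 1.12, 1.07, 1.01, 1.05])

diff = scan1 - scan2
# Within-subject standard deviation for paired repeats.
sw = np.sqrt(np.mean(diff ** 2) / 2.0)
# Repeatability coefficient: the difference expected between two repeated
# measurements on the same subject in 95% of cases.
repeatability_coefficient = 1.96 * np.sqrt(2.0) * sw
# Within-subject coefficient of variation, relative to the grand mean.
cv_within = sw / np.mean(np.concatenate([scan1, scan2]))

print(f"within-subject SD        : {sw:.4f}")
print(f"repeatability coefficient: {repeatability_coefficient:.4f}")
print(f"within-subject CV        : {100 * cv_within:.2f} %")
```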
These works will shed new light, via quantitative imaging, on the brain: its structure, its functioning, its regulatory mechanisms and its disorders, at different scales.
This Research Topic encourages, and will evaluate, the availability of the data and processing pipelines so that reviewers can easily reproduce and/or inspect a paper's key results. Authors are strongly encouraged to make a subset of representative data and the associated processing publicly available, so that any interested party may audit them, although this call is also open to results that are not shared in this way. Our team will provide the possibility to deploy and execute processing pipelines securely, under the access modalities desired by the authors, on the VIP portal (https://vip.creatis.insa-lyon.fr/) using Docker or Singularity containers and Boutiques descriptors (similarly to the guidelines available at https://gitlab.inria.fr/amasson/lesion-segmentation-challenge-miccai21/-/blob/master/SUBMISSION_GUIDELINES.md). This will greatly facilitate the reproducibility of the results by reviewers and, more broadly, by the community if agreed to or requested by the authors.
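As a rough illustration of the kind of pipeline packaging mentioned above, the sketch below writes a minimal Boutiques-style tool descriptor as JSON from Python. The tool name, container image and command line are hypothetical; authors should follow the official Boutiques documentation and the submission guidelines linked above for the exact, up-to-date schema and required fields.

```python
# Minimal sketch of a Boutiques-style tool descriptor written from Python.
# All names and paths below are illustrative assumptions, not a template
# required by this Research Topic.
import json

descriptor = {
    "name": "example-qmri-pipeline",          # hypothetical tool name
    "description": "Fits a quantitative parameter map from a 4D volume.",
    "tool-version": "0.1.0",
    "schema-version": "0.5",
    "command-line": "fit_qmri [INPUT_FILE] parameter_map.nii.gz",
    "container-image": {                      # assumes a published Docker image
        "type": "docker",
        "image": "example/qmri-pipeline:0.1.0"
    },
    "inputs": [
        {"id": "input_file", "name": "Input 4D volume", "type": "File",
         "value-key": "[INPUT_FILE]"}
    ],
    "output-files": [
        {"id": "parameter_map", "name": "Parameter map",
         "path-template": "parameter_map.nii.gz"}
    ]
}

with open("example_qmri_descriptor.json", "w") as f:
    json.dump(descriptor, f, indent=2)
```

A descriptor of this kind can then be checked with the Boutiques command-line tool (e.g., `bosh validate example_qmri_descriptor.json`) before being deployed on the VIP portal.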
The expected validation methods can be based on objective measures (e.g., information-theoretic error estimation, Cramér-Rao theory, Bayesian methods) and may be applied to parameter inference (in biomedical optical imaging, in quantitative MRI), or may focus on validation using complementary measurements (e.g., in vivo acquisition versus immunohistochemistry, multimodality).
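As one concrete example of such an objective measure, the sketch below evaluates Cramér-Rao lower bounds for the parameters of a mono-exponential T2-decay model with additive Gaussian noise, using the analytical Jacobian to build the Fisher information matrix. The model, echo times and noise level are illustrative assumptions, not a prescription for submissions.

```python
# Minimal sketch (illustrative assumptions): Cramér-Rao lower bounds for the
# parameters (S0, T2) of a mono-exponential decay S(TE) = S0 * exp(-TE / T2)
# sampled at a set of echo times with Gaussian noise of known sigma.
import numpy as np

def jacobian(params, te):
    """Analytical derivatives of the signal with respect to (S0, T2)."""
    s0, t2 = params
    decay = np.exp(-te / t2)
    d_s0 = decay
    d_t2 = s0 * te * decay / t2**2
    return np.stack([d_s0, d_t2], axis=1)   # shape (n_echoes, 2)

# Hypothetical acquisition: 8 echo times (ms), true parameters and noise level.
te = np.linspace(10.0, 160.0, 8)
true_params = np.array([1000.0, 80.0])      # S0 (a.u.), T2 (ms)
sigma = 20.0                                # Gaussian noise standard deviation

# Fisher information for Gaussian noise: I = J^T J / sigma^2.
J = jacobian(true_params, te)
fisher = J.T @ J / sigma**2

# Cramér-Rao lower bounds on the parameter variances (diagonal of I^-1).
crlb = np.diag(np.linalg.inv(fisher))
for name, value, bound in zip(["S0", "T2"], true_params, crlb):
    print(f"{name}: true value = {value:.1f}, CRLB std >= {np.sqrt(bound):.2f}")
```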