The perception of sensory information involves intricate processes that span multiple scales, from molecular to cognitive levels. In computational neuroscience, understanding how these multi-scale mechanisms contribute to perception is essential for developing accurate models of brain function. Recent advancements in computational power and modeling techniques have enabled researchers to delve deeper into these complex systems, bridging the gap between micro-level neural activities and macro-level cognitive functions.
The primary goal of this Research Topic is to advance our understanding of computational models that describe multi-scale perception in neuroscience. Despite significant progress, several challenges remain, such as integrating data across different scales and creating models that accurately reflect the hierarchical nature of perception. Addressing these challenges requires collaborative efforts and the application of advanced computational techniques.
This collection aims to:
Present innovative computational models that elucidate the multi-scale nature of perception.
Facilitate an interdisciplinary dialogue between computational neuroscientists and experimentalists to validate and refine these models.
Highlight applications of these models in real-world domains, such as visual perception, human-computer interaction, and neural adaptation.
We invite researchers to contribute original research articles, reviews, and perspective pieces that address various aspects of multi-scale perception from a computational neuroscience standpoint. Specific themes of interest include, but are not limited to:
Computational models for visual object perception: Understanding how the brain processes visual information to recognize and interpret objects, mimicking aspects of the visual cortex and neural circuits involved in visual perception.
Multi-scale models for complex biological systems: Integrating data from multiple scales, from molecular and cellular levels to brain regions and the entire nervous system, to understand how micro-level neural activities influence macro-level cognitive functions.
Methods in image modeling and image processing: Utilizing advanced image-processing techniques to analyze brain imaging data (e.g., MRI, fMRI, and PET), which is crucial for visualizing and interpreting brain structure and function and for understanding how the brain processes visual stimuli.
Neural plasticity and adaptation models: Investigating how neural circuits adapt over time in response to sensory inputs and experiences, highlighting the dynamic nature of perception.
Multi-sensory integration models: Exploring how the brain integrates information from different sensory modalities (e.g., vision, hearing, touch) to form coherent perceptual experiences.
Hierarchical and recurrent neural networks: Employing advanced neural network architectures to model the layered and feedback processes in the brain that contribute to perception.
Predictive coding and Bayesian inference models: Examining how the brain uses prior knowledge and prediction to interpret sensory information, emphasizing the role of expectation in perception (a minimal illustrative sketch follows this list).
Applications in neuroprosthetics and brain-machine interfaces: Applying computational models to develop interfaces that can interpret neural signals and control external devices, enhancing the quality of life for individuals with sensory impairments.
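As a minimal, purely illustrative sketch of the kind of model the predictive coding and Bayesian inference theme covers, the following Python snippet combines a Gaussian prior expectation with a noisy sensory measurement by precision weighting. The function name and all numerical values are our own choices for illustration; they are not drawn from any particular study or submission.

```python
# Minimal sketch of Bayesian perception under Gaussian assumptions:
# the posterior estimate of a stimulus combines a prior expectation with a
# noisy sensory measurement, each weighted by its precision (1 / variance).
# All parameter values are illustrative only.
def bayesian_estimate(prior_mean, prior_var, sensory_obs, sensory_var):
    """Posterior mean and variance of a latent stimulus given one noisy observation."""
    prior_precision = 1.0 / prior_var
    sensory_precision = 1.0 / sensory_var
    posterior_var = 1.0 / (prior_precision + sensory_precision)
    posterior_mean = posterior_var * (prior_precision * prior_mean +
                                      sensory_precision * sensory_obs)
    return posterior_mean, posterior_var

# Example: a reliable prior (low variance) pulls the estimate toward expectation.
mean, var = bayesian_estimate(prior_mean=0.0, prior_var=1.0,
                              sensory_obs=2.0, sensory_var=4.0)
print(f"posterior mean = {mean:.2f}, posterior variance = {var:.2f}")
```

With a tight prior, the posterior estimate is pulled toward the expected value, which is the sense in which expectation shapes perception in these accounts.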
Submitted manuscripts should focus on the neuroscience and computational neuroscience aspects of these themes. We are particularly interested in works that propose novel models, use advanced computational techniques, or provide empirical validation of existing theories. Authors are encouraged to employ diverse methodologies, including machine learning, network modeling, and biophysical simulation.
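As an equally minimal sketch of the biophysical-simulation methodology mentioned above, the snippet below simulates a single leaky integrate-and-fire neuron with forward-Euler integration. The parameter values are illustrative defaults, not prescriptions for submitted models.

```python
# Minimal sketch of a biophysically inspired simulation: a leaky
# integrate-and-fire neuron driven by a constant input current.
# Parameter values are illustrative only (SI units).
def simulate_lif(i_input=2.0e-9, t_max=0.2, dt=1e-4, tau_m=0.02,
                 r_m=1e7, v_rest=-0.070, v_thresh=-0.054, v_reset=-0.070):
    """Simulate the membrane voltage over time; return spike times in seconds."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Leaky integration: dV/dt = (-(V - V_rest) + R_m * I) / tau_m
        v += dt * (-(v - v_rest) + r_m * i_input) / tau_m
        if v >= v_thresh:              # threshold crossing: record spike, reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

print(f"{len(simulate_lif())} spikes in 200 ms")
```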
We look forward to receiving submissions that push the boundaries of our understanding of multi-scale perception in computational neuroscience.
Keywords:
Multi-Scale Perception, Computational Models, Neural Plasticity, Sensory Integration, Predictive Coding
Important Note:
All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.