About this Research Topic
In most neural systems, experimental techniques are limited in their ability to measure the effects of plasticity directly at the systems level, across distributed spatial and temporal scales. Theoretical and computational models can help bridge this gap, though traditional approaches each present difficulties of their own:
1. Top-down approaches derive learning rules and network structure from an objective function, but it is not always clear how to specify an appropriate objective for a particular task.
2. Bottom-up approaches use experimental findings on plasticity-based learning rules to study their consequences at the network level, but they cannot always efficiently yield networks that accomplish a given task.
3. Machine learning approaches using gradient-based or reward-based learning have successfully generated new hypotheses about network mechanisms, but their neuroscientific interpretation is challenging: they do not appropriately constrain the solution space and often lack desired features of biological learning, such as generalization from few examples, flexible switching to new tasks, active learning, and robustness.
These examples illustrate the need for multi- and interdisciplinary efforts to understand the principles of biological learning, which can guide appropriate treatments for neurological and neurodegenerative disorders and potentially lead to improved AI applications for society.
This Research Topic focuses on Research or Review articles that strive to reveal computational principles of biological learning and their implications for neural codes and neural dynamics. It is expected that much of this work will be multi-scale in scope, elucidating how synaptic or cellular scale learning rules underlie systems-level changes in coding and dynamics. Experimental, computational, or theoretical studies are welcome. Studies that are able to bridge across multiple modalities of structural and functional data are particularly encouraged. We also value contributions drawing from efforts in Artificial Intelligence that respect the basic constraints of neurons and synapses, and help illuminate principles of biological learning. While we expect the majority of the papers to be Research or Review articles, we also welcome Perspective or Opinion pieces that articulate strategies that the community should adopt as we further our understanding of biological learning.
Keywords: Adaptive behavior, Behavioral flexibility, Multiscale neural dynamics, Plasticity, Generalization from a few examples, Continual learning, Life-long learning, Robustness, Credit assignment, Biophysical constraints, Efficient learning, Transfer learning, NeuroAI
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.