About this Research Topic
One possible principle that has been put forward is symmetry. Formally, symmetry is the property of an object (physical or mathematical) of remaining unchanged (invariant) under a set of operations or transformations. In the context of vision, and in particular for object categorization, it refers to the brain's ability to generate a representation of the essential structural character of an object that is invariant to transformations in appearance across many different points of view. More generally, invariance (or covariance) to input symmetries is often a highly desirable property of any information processor or model, the brain being an example.
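The distinction between invariance and equivariance (covariance) can be made concrete with a minimal sketch: a circular 1D convolution is equivariant to translations (shifting the input shifts the output by the same amount), while a translation-invariant readout of its output, such as the sum, is unchanged. The functions below are purely illustrative and not drawn from any particular library.

```python
def shift(xs, s):
    # Circular shift to the right by s positions.
    return [xs[(i - s) % len(xs)] for i in range(len(xs))]

def conv(xs, kernel):
    # Circular cross-correlation: output[i] = sum_k xs[(i+k) % n] * kernel[k].
    n = len(xs)
    return [sum(xs[(i + k) % n] * w for k, w in enumerate(kernel)) for i in range(n)]

signal = [0.0, 1.0, 2.0, 3.0, 0.0, 0.0]
kernel = [1.0, -1.0]

# Equivariance: shifting the input, then convolving, equals
# convolving, then shifting the output.
assert conv(shift(signal, 2), kernel) == shift(conv(signal, kernel), 2)

# Invariance: a shift-invariant readout (here, the sum) is unchanged.
assert sum(conv(shift(signal, 2), kernel)) == sum(conv(signal, kernel))
```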
The question of whether, and how, symmetries in sensory inputs or their statistics shape the brain's information processing has attracted considerable interest recently, notably due to the surge of deep convolutional neural networks. To date, deep convolutional networks are considered the best models for predicting neural responses, suggesting that the brain may use analogous algorithms. However, they are neurally implausible in several respects, for example the weight sharing implicit in convolutions. New computational models are needed that go beyond convolutional architectures, incorporate local synaptic constraints and temporal contiguity, and retain good robustness and selectivity. Such models will most probably extend to other senses, such as touch, hearing, and olfaction, defining a new guiding principle for understanding how the brain efficiently compresses and uses high-dimensional information through symmetry.
In parallel, artificial neural networks (ANNs) are efficient data-driven models that capture complex nonlinear input-output relationships across a variety of tasks, from estimating the ground state of a quantum many-body system to decoding neuronal activity in the brain in response to sensory stimuli. This class of algorithms is naturally biased toward learning invariant or covariant representations with respect to input symmetries; however, the mechanisms that enforce such properties remain to be elucidated.
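One simple mechanism by which invariance can be enforced, closely related in spirit to data augmentation, is group averaging: replacing a function by its average over all transformations in a group yields a function that is exactly invariant under that group. The sketch below uses permutation symmetry on a toy readout; the function names are hypothetical.

```python
from itertools import permutations

def f(x):
    # A toy, non-symmetric readout of a 3-element input.
    return x[0] + 2 * x[1] + 3 * x[2]

def f_sym(x):
    # Group averaging: the mean of f over all permutations of the
    # input is, by construction, permutation-invariant.
    perms = list(permutations(x))
    return sum(f(p) for p in perms) / len(perms)

x = (1.0, 2.0, 3.0)
x_shuffled = (3.0, 1.0, 2.0)

assert f(x) != f(x_shuffled)          # the raw readout is not invariant
assert f_sym(x) == f_sym(x_shuffled)  # the averaged readout is
```

Architectures that encourage symmetry often build an analogous averaging or weight-tying step directly into the model, rather than relying on augmented training data alone.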
The goal of this Research Topic is to cover promising, recent, and novel research on the role of symmetry in shaping learning and information processing in real and artificial neural networks, from a theoretical and computational perspective, with applications to machine learning and neuroscience as well as to other disciplines where symmetry plays an important role. We welcome submissions of high-quality research and review articles.
Areas to be covered in this Research Topic may include, but are not limited to:
● Neuroscience & Cerebral Cortex
Invariant/covariant representations in neuronal populations.
Symmetry and efficient information compression of sensory stimuli.
Relevance of symmetry concepts to higher brain function: abstractions.
Plasticity rules and symmetry.
● Deep Neural Networks
Implicit network bias and its relation to invariant representations.
Learning dynamics of invariant/covariant representations.
New architectures that encourage or enforce symmetries (beyond translations).
Sample complexity for learning invariant representations.
Data augmentations.
● Applications in the Natural Sciences
Physics: physics-informed neural networks; models in particle physics and quantum many-body systems.
Chemistry: models of molecules and proteins that enforce physical symmetries.
Keywords: Plasticity, Symmetry, Invariance, Equivariance, Artificial Neural Networks
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.