
EDITORIAL article

Front. Cell. Neurosci., 20 December 2022
Sec. Cellular Neurophysiology
This article is part of the Research Topic Brain-Inspired Computing: Neuroscience Drives the Development of New Electronics and Artificial Intelligence

Editorial: Brain-inspired computing: Neuroscience drives the development of new electronics and artificial intelligence

  • 1Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena, Italy
  • 2Department of Engineering “Enzo Ferrari,” University of Modena and Reggio Emilia, Modena, Italy
  • 3Center for Neuroscience and Neurotechnology, University of Modena and Reggio Emilia, Modena, Italy
  • 4Centre for Electronics Frontiers, School of Engineering, University of Edinburgh, Edinburgh, United Kingdom
  • 5Neuroscience Area, International School of Advanced Studies (SISSA), Trieste, Italy

The advent of artificial intelligence (AI) applications in everyday life has steadily increased the demand for advanced machine learning systems such as artificial neural networks (ANNs), which outperform humans in a growing number of tasks. Moreover, the development of humanoid robotic platforms showing intelligent behaviors has boosted research on engineered systems that mimic neural functions. Nonetheless, solutions based on conventional hardware with limited energy efficiency are poorly sustainable, require repeated training cycles, depend on supervised learning rules, and most importantly must be trained offline on very large datasets before they can perform online on the desired tasks. All these factors severely limit the pervasive deployment of sustainable AI solutions. In recent years, a large number of industrial applications based on classic computational paradigms and conventional technologies have emerged. In this scenario, a significant new opportunity is represented by alternative, sustainable strategies inspired by the way the human brain works, collectively known as neuromorphic computing.

The term neuromorphic is strongly interdisciplinary and encompasses many fields, spanning from materials science to electronic architectures and software models, all sharing a common framework dictated by some analogy with brain function. On the hardware side, the development of bio-inspired microelectronics dates back to the 1980s (Mead, 1989) and was grounded in the analogy between the biophysics at work in biological neuronal membranes and the behavior of charge carriers in semiconductor transistors operating in the sub-threshold regime (Furber, 2016). Many scientific works have appeared since those seminal papers, testifying to progress along four main directions:

i) Materials: Novel nanomaterials proposed as candidates for neuromorphic computing devices, showing functionalities that better mimic neuronal behaviors (Sangwan and Hersam, 2020);

ii) Synapses: Memristive devices, as alternatives to conventional data storage concepts, capable of reproducing distinctive characteristics of biological long-term synaptic plasticity in both supervised (Florini et al., 2022) and unsupervised regimes (Serb et al., 2016);

iii) Neurons: Silicon neurons as the main building blocks of novel architectures, implementing systems that react in real time or bidirectional brain-machine interfaces, where design complexity scales with the level of detail of the neuronal representation (Indiveri et al., 2011);

iv) Architecture: Spiking neural networks (SNNs) as sustainable solutions, mimicking the operation of brain circuits to perform distributed computation efficiently. The development of silicon architectures supporting SNNs is one of the main cores of neuromorphic computing research (George et al., 2020). In a broader perspective, an electronic architecture in the context of AI is a system designed and optimized either to implement conventional ANNs in hardware or to emulate spiking networks (Rose et al., 2021); a minimal spiking-neuron sketch illustrating points ii)–iv) follows this list.
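To make these terms concrete, the following minimal Python sketch shows the kind of building block that points ii)–iv) refer to: a leaky integrate-and-fire neuron driven by Poisson input spike trains, with a toy pair-based spike-timing-dependent plasticity (STDP) rule updating its synaptic weights. The sketch is our own illustration rather than a model taken from any of the cited works, and all parameter values are arbitrary assumptions.

```python
# Minimal sketch (illustrative only): a leaky integrate-and-fire (LIF) neuron
# driven by Poisson input spikes, with a toy pair-based STDP rule adjusting
# the input weights. Parameter values are assumptions, not from any citation.
import numpy as np

rng = np.random.default_rng(0)

dt, t_sim = 1e-3, 1.0                    # time step and simulated duration (s)
tau_m, v_th, v_reset = 20e-3, 1.0, 0.0   # membrane time constant, threshold, reset
tau_pre, tau_post = 20e-3, 20e-3         # STDP trace time constants
a_plus, a_minus = 0.01, 0.012            # potentiation / depression amplitudes

n_in = 5
w = rng.uniform(0.2, 0.6, n_in)          # synaptic weights
pre_rate = 20.0                          # input Poisson rate (Hz)

v = 0.0
pre_trace = np.zeros(n_in)
post_trace = 0.0
out_spikes = []

for step in range(int(t_sim / dt)):
    # Poisson input spikes for this time step
    pre_spikes = rng.random(n_in) < pre_rate * dt

    # Leaky integration of the weighted input
    v += dt / tau_m * (-v) + np.dot(w, pre_spikes)

    # Decay the plasticity traces
    pre_trace *= np.exp(-dt / tau_pre)
    post_trace *= np.exp(-dt / tau_post)

    # On a presynaptic spike: bump its trace, depress by the post trace
    pre_trace[pre_spikes] += 1.0
    w[pre_spikes] -= a_minus * post_trace

    if v >= v_th:                        # postsynaptic spike
        out_spikes.append(step * dt)
        v = v_reset
        post_trace += 1.0
        w += a_plus * pre_trace          # potentiate recently active inputs
        np.clip(w, 0.0, 1.0, out=w)

print(f"{len(out_spikes)} output spikes, final weights: {np.round(w, 3)}")
```

Neuromorphic hardware realizes these same dynamics with analog or digital circuits that evolve in real time rather than with a simulation loop; the code above is only meant to make the terminology tangible.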

On the software side, much of the effort in developing neuromorphic solutions has been devoted to implementing AI algorithms that draw inspiration from neuronal circuit architectures. At the same time, it is widely acknowledged that, despite the impressive number of papers published in the last two decades proposing AI solutions based on ANNs, the growth rate of deployed applications is decreasing. There is general agreement that alternatives must be explored, and the brain, through its spike-based operations, remains the main target given its impressive computational capacity achieved on a limited metabolic energy budget.

In most cases, these research directions are pursued separately despite a common thread: they are inspired by the same neural principles. This joint Frontiers in Cellular Neuroscience and Frontiers in Neuroscience Research Topic was launched to gather new developments in the neuromorphic field and, at the same time, to explore how extensive the reciprocal cross-fertilization between disciplines is. A first example of this attempt is the work by Lee et al., which introduces an analogy between one of the most widely employed techniques for regularizing ANNs and a real biological architecture. The authors propose that serotoninergic axons satisfy most of the requirements for ANN parallelization and, as a proof of concept, demonstrate that serotoninergic fibers, through their stochasticity, can potentially implement a dropout mechanism that supports neuroplasticity. This example suggests how a deeper investigation of neuronal functional and structural properties can lead to improved AI performance. In pursuit of practical applications for bio-inspired SNNs, Vijayan and Diwakar present a cerebellum-inspired SNN architecture that performs pattern classification with an accuracy comparable to state-of-the-art machine learning algorithms. The same algorithm, inspired by the physiology of the cerebellar network, allowed them to reconstruct the kinematic parameters of a robotic arm to be controlled with a low error rate, showing the potential of SNNs and their versatility across applications. On the same theme, Bittar and Garner explore the potential of different spiking neuron models in SNNs, showing how the integration of different network architectures and training algorithms leads to intriguing perspectives in classification tasks such as speech recognition. Moving to the hardware level, Baroni et al. present an energy-efficient electronic architecture based on resistive-switching memory crossbars, which provide a convenient primitive for performing matrix-vector multiplication in a single computational step (a minimal sketch of this operation follows below). Such an improvement in performance can boost the operating speed of ANNs, with ideal applications in big-data fields such as biomedical and genomics research.
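As an illustration of the crossbar primitive mentioned above, the short Python sketch below shows how applying an input vector as row voltages across a matrix of programmable conductances yields all column currents at once, i.e., a full matrix-vector product in a single analog read-out step (Ohm's law at each cross-point, Kirchhoff's current law along each column). This is our own illustrative example, not the architecture of Baroni et al.; the conductance range, voltage range, and the simple multiplicative read-noise term are assumptions.

```python
# Minimal sketch (illustrative only): an ideal resistive-switching crossbar
# computing a matrix-vector product in one analog read. Each cross-point
# conductance G[i, j] encodes a weight; row voltages V produce column
# currents I = G^T @ V. Values and noise model are assumptions.
import numpy as np

rng = np.random.default_rng(1)

n_rows, n_cols = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(n_rows, n_cols))   # conductances (siemens)
V = rng.uniform(0.0, 0.2, size=n_rows)               # read voltages (volts)

# Ideal crossbar read-out: each column current sums G[i, j] * V[i] over rows
I_ideal = G.T @ V

# Real devices deviate from the ideal dot product; a simple multiplicative
# read-noise term stands in for such non-idealities (purely illustrative)
I_measured = I_ideal * (1.0 + rng.normal(0.0, 0.02, size=n_cols))

print("ideal column currents (A):", I_ideal)
print("noisy column currents (A):", I_measured)
```

In practice, signed weights are typically encoded as the difference between two conductance columns and the analog currents are digitized by peripheral converters; the sketch omits these details for brevity.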

We thank the authors for their excellent contributions to this Research Topic and we hope that this collection provides the first step for a series of cross-disciplinary works on brain-inspired computing, combining software and hardware, and more generally linking conventional AI to bio-inspired systems. We look forward to continuing the discussion in the field.

Author contributions

JM wrote the first draft of the manuscript. All authors contributed equally to the editing of the Research Topic and conceptualization of the manuscript. All authors contributed to revisions and approved the submitted version.

Funding

JM acknowledges funding support from the EU Horizon 2020 Framework Program for Research and Innovation (Specific Grant Agreement 945539—the Flag ERA JTC 2019).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Florini, D., Gandolfi, D., Mapelli, J., Benatti, L., Pavan, P., and Puglisi, F. M. (2022). A hybrid CMOS-memristor spiking neural network supporting multiple learning rules. IEEE Trans. Neural Netw. Learn. Syst. 1–13. doi: 10.1109/TNNLS.2022.3202501

Furber, S. (2016). Large-scale neuromorphic computing systems. J. Neural. Eng. 13, 051001. doi: 10.1088/1741-2560/13/5/051001

George, R., Chiappalone, M., Giugliano, M., Levi, T., Vassanelli, S., Partzsch, J., et al. (2020). Plasticity and adaptation in neuromorphic biohybrid systems. iScience 23, 101589. doi: 10.1016/j.isci.2020.101589

Indiveri, G., Linares-Barranco, B., Hamilton, T. J., van Schaik, A., Etienne-Cummings, R., Delbruck, T., et al. (2011). Neuromorphic silicon neuron circuits. Front. Neurosci. 5, 73. doi: 10.3389/fnins.2011.00073

Mead, C. (1989). Analog VLSI and Neural Systems. Reading, MA: Addison-Wesley.

Rose, G. S., Shawkat, M. S. A., Foshie, A. Z., Murray, J. J., and Adnan, M. M. (2021). A system design perspective on neuromorphic computer processors. Neuromor. Comput. Eng. 1, 022001. doi: 10.1088/2634-4386/ac24f5

Sangwan, V. K., and Hersam, M. C. (2020). Neuromorphic nanoelectronic materials. Nat. Nanotechnol. 15, 517–528. doi: 10.1038/s41565-020-0647-z

Serb, A., Bill, J., Khiat, A., Berdan, R., Legenstein, R., Prodromakis, T., et al. (2016). Unsupervised learning in probabilistic neural networks with multi-state metal-oxide memristive synapses. Nat. Commun. 7, 12611. doi: 10.1038/ncomms12611

Keywords: neuromorphic, neuro-inspired computing, computational neuroscience, bio-inspired artificial intelligence, electronic

Citation: Gandolfi D, Puglisi FM, Serb A, Giugliano M and Mapelli J (2022) Editorial: Brain-inspired computing: Neuroscience drives the development of new electronics and artificial intelligence. Front. Cell. Neurosci. 16:1115395. doi: 10.3389/fncel.2022.1115395

Received: 03 December 2022; Accepted: 06 December 2022;
Published: 20 December 2022.

Edited and reviewed by: Enrico Cherubini, European Brain Research Institute, Italy

Copyright © 2022 Gandolfi, Puglisi, Serb, Giugliano and Mapelli. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jonathan Mapelli, jonathan.mapelli@unimore.it
