Background
Spiking Neural Networks (SNNs) closely imitate biological neural networks. Information is processed in both the spatial and temporal domains, making SNNs highly attractive for faithfully mimicking the biological brain. Biological brains code and transmit sensory information as spikes that capture the spatial and temporal structure of the environment with remarkable precision. This information is processed asynchronously by neural layers that recognize complex spatio-temporal patterns with sub-millisecond delays and a power budget on the order of 20 W. The efficient spike coding mechanism, together with the asynchronous and sparse processing and communication of spikes, appears to be key to the energy efficiency and high-speed computation of biological brains. The low-power, event-based computation of SNNs makes them attractive compared to other artificial neural networks (ANNs).
Over the past decades, considerable research has been devoted to the efficient implementation of spiking sensors and processors. Early spiking vision sensors emulated different types of spatial or spatio-temporal filtering and coding of the visual stimulus but suffered from poor accuracy and low resolution. Dynamic vision sensors code the temporal variation of the visual signal, yielding a very efficient representation with high spatial and temporal resolution. These sensors provide a spike-coded, compressed representation of the visual scene in which spatial and temporal redundancies are suppressed. They have reached megapixel resolution and have sparked the interest of industrial and academic communities in SNN systems. Recently, a variety of spike-based sensors have been developed for different sensing modalities (vision, audio, tactile, and olfaction). Similarly, early SNN processors were limited to low-resolution convolutional processing; they have been superseded by disruptive many-core SNN processors that include neurons with programmable parameters and behavioral models, as well as sophisticated spike routers for programmable, reconfigurable synaptic connections.
Despite the greater potential of SNNs to efficiently solve complex perception problems involving signals with rich spatio-temporal behavior, such as gesture recognition, moving-object tracking, and audio recognition, they still underperform ANNs even on simpler problems such as MNIST digit classification. One reason is that SNN training methods must deal with more complexly coded signals: ANN training methods are not directly applicable to SNNs, and SNN training depends directly on the spike coding representation. SNN training can be supervised or unsupervised, using different algorithms that reproduce different learning phenomena. Among the supervised SNN training methods, some are based on Back-Propagation-Through-Time (BPTT); however, these require keeping track of the neuron membrane potential over time, demanding large memory and computation capabilities. Recently, several approximation methods that compute weight updates using local information have been proposed. Implementing and improving models and training methods for SNNs is crucial for advancing both real-life and mobile applications in the field of neuromorphic computing.
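To make the memory cost of BPTT concrete, the following minimal sketch simulates a single leaky integrate-and-fire (LIF) neuron over discrete time steps. The function name, parameter values, and hard-reset behavior are illustrative assumptions, not part of this call; the point is that the membrane potential must be stored at every step, which is exactly the trace that BPTT must retain.

```python
import numpy as np

def lif_forward(spikes_in, w, tau=0.9, v_th=1.0):
    """Simulate a leaky integrate-and-fire (LIF) neuron for T time steps.

    Hypothetical illustration: BPTT through such a neuron must retain
    the membrane potential at every step (v_trace below), which is the
    source of the large memory footprint mentioned above.
    """
    T = spikes_in.shape[0]
    v_trace = np.zeros(T)   # stored potentials: what BPTT must keep per step
    s_out = np.zeros(T)     # output spike train
    v = 0.0
    for t in range(T):
        v = tau * v + w * spikes_in[t]   # leaky integration of weighted input
        if v >= v_th:                    # threshold crossing emits a spike
            s_out[t] = 1.0
            v = 0.0                      # hard reset after the spike
        v_trace[t] = v
    return s_out, v_trace
```

With a constant input spike train and w = 0.6, the neuron integrates over two steps before each threshold crossing, so it fires on every other step; the local-update methods mentioned above aim to avoid storing `v_trace` for the whole sequence.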
Goal
In this Research Topic, Frontiers in Neuroscience – Neuromorphic Engineering seeks to address implementations of spike-based learning in order to promote novel ideas and advance this field of research.
Scope and information for Authors
We particularly welcome submissions of Original Research, Review, Methods, and Perspective articles on the following (but not limited to) sub-topics:
• Efficient spike coding representations of spatio-temporal signals;
• Efficient spike-based sensors and sensor fusion;
• Neuromorphic architectures for implementing spike-based learning rules;
• Spike-based cognitive algorithms;
• Neurobiology-inspired algorithms;
• Spike-based training methods;
• Hardware-efficient spiking algorithms;
• Associative memory;
• Federated learning using spiking neural networks;
• Multi-timescale learning.