AUTHOR=Yousefzadeh Amirreza, Stromatias Evangelos, Soto Miguel, Serrano-Gotarredona Teresa, Linares-Barranco Bernabé TITLE=On Practical Issues for Stochastic STDP Hardware With 1-bit Synaptic Weights JOURNAL=Frontiers in Neuroscience VOLUME=12 YEAR=2018 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2018.00665 DOI=10.3389/fnins.2018.00665 ISSN=1662-453X ABSTRACT=

In computational neuroscience, synaptic plasticity learning rules are typically studied using the full 64-bit floating-point precision that computers provide. For dedicated hardware implementations, however, the precision used directly penalizes not only the required memory resources but also the computing, communication, and energy resources. When it comes to hardware engineering, a key question is to find the minimum number of bits needed to keep the neurocomputational system working satisfactorily. Here we present techniques and results obtained when limiting synaptic weights to 1-bit precision, applied to a Spike-Timing-Dependent Plasticity (STDP) learning rule in Spiking Neural Networks (SNN). We first illustrate the operation of STDP with 1-bit synapses by replicating a classical biological experiment on visual orientation tuning, using a simple four-neuron setup. We then apply 1-bit STDP learning to the hidden feature-extraction layer of a two-layer system, whose second (output) layer uses previously reported SNN classifiers. The systems are tested on two spiking datasets: a poker card symbols dataset recorded with a Dynamic Vision Sensor (DVS) and a version of the MNIST dataset encoded as Poisson-distributed spike trains. Tests are performed using the in-house MegaSim event-driven behavioral simulator and by implementing the systems on FPGA (Field Programmable Gate Array) hardware.
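
To illustrate the general idea of stochastic STDP acting on binary synapses, the sketch below shows one way such an update could look. It is a minimal illustration, not the rule used in the paper: the potentiation/depression probabilities (P_POT, P_DEP), the timing window (T_WINDOW), and the function stochastic_stdp_1bit are all hypothetical names and values chosen for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not taken from the paper): update probabilities
# and the STDP timing window in milliseconds.
P_POT = 0.1      # probability of setting a weight to 1 on a causal pairing
P_DEP = 0.05     # probability of clearing a weight on an anti-causal pairing
T_WINDOW = 20.0  # pairings farther apart than this (ms) trigger no update

def stochastic_stdp_1bit(weights, t_pre, t_post):
    """Apply one stochastic STDP step to a vector of 1-bit synaptic weights.

    weights : np.ndarray of 0/1 values, one per presynaptic input
    t_pre   : np.ndarray of last presynaptic spike times (ms), one per input
    t_post  : scalar postsynaptic spike time (ms)
    """
    dt = t_post - t_pre                      # positive => pre fired before post
    causal = (dt > 0) & (dt < T_WINDOW)      # candidates for potentiation
    anticausal = (dt < 0) & (-dt < T_WINDOW) # candidates for depression

    # Stochastic potentiation: flip eligible weights to 1 with probability P_POT.
    weights[causal & (rng.random(weights.shape) < P_POT)] = 1
    # Stochastic depression: flip eligible weights to 0 with probability P_DEP.
    weights[anticausal & (rng.random(weights.shape) < P_DEP)] = 0
    return weights

# Example: 8 binary synapses with random initial state and spike times.
w = rng.integers(0, 2, size=8)
pre_times = rng.uniform(0.0, 40.0, size=8)
print(stochastic_stdp_1bit(w, pre_times, t_post=20.0))
```

Because each weight can only be 0 or 1, the randomness of the flip probabilities plays the role that graded weight changes play in full-precision STDP, which is the general motivation for studying stochastic rules at 1-bit precision.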