AUTHOR=Rueckauer, Bodo; Lungu, Iulia-Alexandra; Hu, Yuhuang; Pfeiffer, Michael; Liu, Shih-Chii
TITLE=Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification
JOURNAL=Frontiers in Neuroscience
VOLUME=11
YEAR=2017
URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2017.00682
DOI=10.3389/fnins.2017.00682
ISSN=1662-453X
ABSTRACT=

Spiking neural networks (SNNs) can potentially offer an efficient way of performing inference because the neurons in the network are sparsely activated and computations are event-driven. Previous work showed that simple continuous-valued deep Convolutional Neural Networks (CNNs) can be converted into accurate spiking equivalents, but these networks did not include common operations such as max-pooling, softmax, batch normalization, and Inception modules. This paper presents spiking equivalents of these operations, thereby allowing the conversion of nearly arbitrary CNN architectures. We demonstrate the conversion of popular CNN architectures, including VGG-16 and Inception-v3, into SNNs that produce the best results reported to date on MNIST, CIFAR-10, and the challenging ImageNet dataset. SNNs can trade off classification error rate against the number of available operations, whereas deep continuous-valued neural networks require a fixed number of operations to achieve their classification error rate. Using LeNet for MNIST and BinaryNet for CIFAR-10 as examples, we show that, at the cost of an increase in error rate of a few percentage points, the SNNs can achieve more than a 2x reduction in operations compared to the original CNNs. This highlights the potential of SNNs, in particular when deployed on power-efficient neuromorphic spiking neuron chips, for use in embedded applications.
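The core idea behind the conversion described in the abstract is that the firing rate of an integrate-and-fire (IF) neuron approximates the ReLU activation of the corresponding analog neuron. A minimal sketch of this approximation is given below; the function name and parameter values are illustrative assumptions, not taken from the paper's code, though the reset-by-subtraction mechanism is the one the paper analyzes:

```python
def if_neuron_rate(input_current, n_steps=1000, v_thresh=1.0):
    """Simulate an integrate-and-fire neuron driven by a constant input
    (standing in for the ReLU pre-activation of the analog neuron) and
    return its firing rate in spikes per time step."""
    v = 0.0
    spikes = 0
    for _ in range(n_steps):
        v += input_current
        if v >= v_thresh:
            v -= v_thresh  # reset by subtraction keeps the residual charge
            spikes += 1
    return spikes / n_steps

# The rate approximates max(0, x), saturating at one spike per step:
# negative inputs never cross threshold, so the rate is 0, mirroring ReLU.
for x in (-0.3, 0.0, 0.25, 0.9):
    rate = if_neuron_rate(x)
```

Because the rate saturates at one spike per time step, activations in the converted network must be normalized below this ceiling, which is one motivation for the parameter-normalization schemes discussed in conversion work like this.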