Scaling Deep Spiking Neural Networks with Binary Stochastic Activations

Abstract The modern era has witnessed a proliferation of portable devices that use Artificial Intelligence (AI) to enhance user experiences. The majority of these AI tasks are performed by large neural networks, which require substantial memory and compute power. This has spurred growing interest in Spiking Neural Networks (SNNs), which communicate through binary activations or 'spikes' and offer a bio-plausible, energy-efficient alternative to traditional deep neural networks (DNNs). In this work, we present deep spiking neural networks with binary stochastic activations that are tailored for implementation on emerging hardware platforms. We evaluate two deep neural network models, VGG-9 and VGG-16, with binary stochastic activations on the CIFAR-10 and CIFAR-100 datasets, respectively. We achieve state-of-the-art accuracy and a 1.4x improvement in energy consumption over a network with ReLU neurons, owing to spike-based communication. We further investigate extremely quantized versions of these networks with binary weights and show an energy benefit of 28x over full-precision neural networks. Thus, we present scalable deep spiking neural networks that achieve performance comparable to DNNs while providing substantial energy benefits.
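For illustration, below is a minimal PyTorch sketch of a binary stochastic activation of the kind the abstract describes: a unit fires (outputs 1) with a probability derived from its pre-activation, and a surrogate gradient keeps the network trainable. The sigmoid firing probability and the sigmoid-derivative surrogate are assumptions for this sketch; the paper's exact formulation may differ.

```python
import torch

class BinaryStochasticSpike(torch.autograd.Function):
    """Stochastic binary activation: emit a spike (1) with probability
    sigmoid(x), else output 0. Backward uses a surrogate gradient
    (derivative of the sigmoid probability) so gradients can flow
    through the non-differentiable sampling step."""

    @staticmethod
    def forward(ctx, x):
        p = torch.sigmoid(x)       # assumed firing probability
        ctx.save_for_backward(p)
        return torch.bernoulli(p)  # sample a binary spike (0 or 1)

    @staticmethod
    def backward(ctx, grad_output):
        (p,) = ctx.saved_tensors
        # Surrogate gradient: d/dx sigmoid(x) = p * (1 - p)
        return grad_output * p * (1.0 - p)

# Usage: replace ReLU with the stochastic spiking non-linearity.
spike = BinaryStochasticSpike.apply
x = torch.randn(4, 8, requires_grad=True)
out = spike(x)        # binary tensor of 0s and 1s
out.sum().backward()  # gradients flow via the surrogate
```

Because the activations are strictly binary, downstream layers can replace multiply-accumulate operations with conditional accumulates, which is the source of the spike-based energy advantage reported above.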
Authors
  • Deboleena Roy (Purdue)
  • Indranil Chakraborty (Purdue)
  • Kaushik Roy (Purdue)
Date Jul-2019
Venue 3rd IEEE International Conference on Cognitive Computing (2019)