Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks

Zhang, Malu, Wang, Jiadong, Wu, Jibin, Belatreche, Ammar, Amornpaisannon, Burin, Zhang, Zhixuan, Miriyala, V. P. K., Qu, Hong, Chua, Yansong, Carlson, Trevor E. and Li, Haizhou (2022) Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 33 (5). pp. 1947-1958. ISSN 2162-237X

IEEE TNNLS_Rectified-Linear-Postsynaptic-Potential-Function-for-Back-Propagation-of-Deep-Spiking-Neural-Networks.pdf - Accepted Version



Spiking Neural Networks (SNNs) use spatiotemporal spike patterns to represent and transmit information, which are not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation. Just like other deep learning techniques, Deep Spiking Neural Networks (DeepSNNs) benefit from the deep architecture. However, the training of DeepSNNs is not straightforward because the well-studied error back-propagation (BP) algorithm is not directly applicable. In this paper, we first establish an understanding as to why error back-propagation does not work well in DeepSNNs. We then propose a simple yet efficient Rectified Linear Postsynaptic Potential function (ReL-PSP) for spiking neurons and a Spike-Timing-Dependent Back-Propagation (STDBP) learning algorithm for DeepSNNs, where the timing of individual spikes is used to convey information (temporal coding) and learning (back-propagation) is performed based on spike timing in an event-driven manner. We show that DeepSNNs trained with the proposed single-spike-time-based learning algorithm can achieve state-of-the-art classification accuracy. Furthermore, by utilizing the trained model parameters obtained from the proposed STDBP learning algorithm, we demonstrate ultra-low-power inference operations on a recently proposed neuromorphic inference accelerator. The experimental results also show that the neuromorphic hardware consumes a total of only 0.751 mW and achieves a low latency of 47.71 ms to classify an image from the MNIST dataset. Overall, this work investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware.
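The abstract's central idea can be illustrated with a minimal sketch. Assuming the rectified linear kernel form K(t - t_i) = max(t - t_i, 0), a neuron's membrane potential under ReL-PSP is piecewise linear in time, so the threshold-crossing (output spike) time has a closed-form solution once the set of causal inputs is fixed. The function names below and the single-causal-set simplification are illustrative, not the paper's exact implementation:

```python
import numpy as np

def rel_psp(t, t_i):
    """Rectified linear PSP kernel: K(t - t_i) = t - t_i for t > t_i, else 0."""
    return np.maximum(t - t_i, 0.0)

def spike_time(input_times, weights, threshold=1.0):
    """Solve for the output spike time of a neuron with membrane potential
    V(t) = sum_i w_i * K(t - t_i), assuming (for simplicity) that every
    listed input arrives before the output spike.

    With the linear kernel, V(t) = t * sum(w) - sum(w * t_i), so the
    threshold crossing is t_out = (threshold + sum(w * t_i)) / sum(w).
    Because t_out is a differentiable function of the input spike times
    and weights, errors can be back-propagated through spike timings.
    """
    w_sum = np.sum(weights)
    if w_sum <= 0:
        return np.inf  # potential never reaches threshold
    t_out = (threshold + np.sum(weights * input_times)) / w_sum
    if t_out < np.max(input_times):
        # causal-set assumption violated; a full solver would iterate
        # over candidate subsets of inputs instead of giving up
        return np.inf
    return t_out
```

For example, two inputs at times 1.0 and 2.0 with unit weights and threshold 1.0 give t_out = (1.0 + 3.0) / 2.0 = 2.0. Stronger (larger-weight) or earlier inputs pull the output spike earlier, which is the timing dynamic the STDBP gradient exploits.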

Item Type: Article
Additional Information: Funding information: This work was supported in part by the National Key Research and Development Program of China under Grant 2018AAA0100202, in part by Singapore Government’s Research, Innovation, and Enterprise 2020 Plan (Advanced Manufacturing and Engineering Domain) under Programmatic Grant A1687b0033 and Programmatic Grant I2001E0053, and in part by the Science and Engineering Research Council, Agency of Science, Technology, and Research, Singapore, through the National Robotics Program under Grant 192 25 00054. The work of Malu Zhang was supported in part by the National Natural Science Foundation of China under Grant 62106038 and Grant 61976043, in part by China Postdoctoral Science Foundation under Grant 2020M680148, and in part by Zhejiang Lab’s International Talent Found for Young Professionals. The work of Jibin Wu was supported in part by Zhejiang Laboratory under Grant 2019KC0AB02.
Uncontrolled Keywords: Spiking neural networks, Deep neural networks, Spike-timing-dependent learning, Event-driven, Neuromorphic hardware
Subjects: G400 Computer Science
G500 Information Systems
G700 Artificial Intelligence
Department: Faculties > Engineering and Environment > Computer and Information Sciences
Depositing User: Rachel Branson
Date Deposited: 03 Sep 2021 10:12
Last Modified: 04 May 2022 13:45


