TY - JOUR
T1 - Dynamic threshold integrate and fire neuron model for low latency spiking neural networks
AU - Wu, Xiyan
AU - Zhao, Yufei
AU - Song, Yong
AU - Jiang, Yurong
AU - Bai, Yashuo
AU - Li, Xinyi
AU - Zhou, Ya
AU - Yang, Xin
AU - Hao, Qun
N1 - Publisher Copyright:
© 2023 Elsevier B.V.
PY - 2023/8/1
Y1 - 2023/8/1
N2 - Spiking Neural Networks (SNNs) operate with asynchronous discrete events, which enables lower power consumption and greater computational efficiency on event-driven hardware than Artificial Neural Networks (ANNs). Conventional ANN-to-SNN conversion methods usually employ the Integrate-and-Fire (IF) neuron model with a fixed threshold to act as the Rectified Linear Unit (ReLU). However, many input spikes are required before the membrane potential reaches the fixed threshold and the neuron fires, which leads to high inference latency. In this work, we propose a Dynamic Threshold Integrate-and-Fire (DTIF) neuron model that exploits the threshold variability of biological neurons, where the threshold is inversely related to the neuron input. Spike activity is increased by dynamically adjusting the threshold at each simulation time-step to reduce latency. Compared to state-of-the-art conversion methods, ANN-to-SNN conversion using the DTIF model achieves lower latency with competitive accuracy, as verified with deep architectures on image classification tasks including the MNIST, CIFAR-10, and CIFAR-100 datasets. Moreover, it achieves 7.14× faster inference at 0.44× the energy consumption of the typical maximum-normalization method.
KW - ANN-to-SNN conversion
KW - Image classification
KW - Spiking neural networks
KW - Threshold variability
UR - http://www.scopus.com/inward/record.url?scp=85159100110&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2023.126247
DO - 10.1016/j.neucom.2023.126247
M3 - Article
AN - SCOPUS:85159100110
SN - 0925-2312
VL - 544
JO - Neurocomputing
JF - Neurocomputing
M1 - 126247
ER -