Dynamic threshold integrate and fire neuron model for low latency spiking neural networks

Xiyan Wu, Yufei Zhao, Yong Song*, Yurong Jiang, Yashuo Bai, Xinyi Li, Ya Zhou, Xin Yang, Qun Hao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Spiking Neural Networks (SNNs) operate with asynchronous discrete events, which enables lower power consumption and greater computational efficiency on event-driven hardware than Artificial Neural Networks (ANNs). Conventional ANN-to-SNN conversion methods usually employ the Integrate and Fire (IF) neuron model with a fixed threshold to act as the Rectified Linear Unit (ReLU). However, many input spikes are required for the membrane potential to reach the fixed threshold and fire, which leads to high inference latency. In this work, we propose a Dynamic Threshold Integrate and Fire (DTIF) neuron model that exploits the threshold variability of biological neurons, where the threshold is inversely related to the neuron input. Spike activity is increased by dynamically adjusting the threshold at each simulation time-step, which reduces latency. Compared with state-of-the-art conversion methods, ANN-to-SNN conversion using the DTIF model achieves lower latency with competitive accuracy, as verified with deep architectures on image classification tasks including the MNIST, CIFAR-10, and CIFAR-100 datasets. Moreover, it achieves 7.14× faster inference at 0.44× the energy consumption of the typical maximum-normalization method.
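To make the core idea concrete, the following is a minimal sketch of a single dynamic-threshold IF neuron. The reciprocal threshold rule, the `alpha` parameter, and the reset-by-subtraction scheme are illustrative assumptions chosen to match the abstract's description (threshold inversely related to the input); the paper's exact DTIF formulation may differ.

```python
import numpy as np

def dtif_neuron(inputs, v_th_base=1.0, alpha=0.5):
    """Minimal sketch of a dynamic-threshold integrate-and-fire neuron.

    At each simulation time-step the firing threshold is set inversely
    to the current input, so stronger inputs lower the threshold and
    allow earlier firing (more spike activity, lower latency).
    """
    v = 0.0                          # membrane potential
    spikes = np.zeros(len(inputs))
    for t, x in enumerate(inputs):
        v += x                       # integrate the input current
        # Hypothetical threshold rule: inversely related to the input.
        v_th = v_th_base / (1.0 + alpha * max(x, 0.0))
        if v >= v_th:                # fire, then reset by subtraction
            spikes[t] = 1.0
            v -= v_th
    return spikes

# Example: a constant input of 0.3 per step over 10 time-steps fires
# sooner than a fixed-threshold IF neuron with v_th = 1.0 would.
print(dtif_neuron(np.full(10, 0.3)))
```

Under a fixed threshold of 1.0, the first spike in this example would occur only after four integration steps; the input-dependent threshold brings it forward, which is the latency-reduction mechanism the abstract describes.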

Original language: English
Article number: 126247
Journal: Neurocomputing
Volume: 544
Publication status: Published - 1 Aug 2023

Keywords

  • ANN-to-SNN conversion
  • Image classification
  • Spiking neural networks
  • Threshold variability
