Spiking Neural Networks for Object Detection Based on Integrating Neuronal Variants and Self-Attention Mechanisms

Weixuan Li, Jinxiu Zhao, Li Su*, Na Jiang, Quan Hu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Thanks to their event-driven asynchronous computing capabilities and low power consumption, spiking neural networks (SNNs) show significant potential for computer vision tasks, especially object detection. However, effective training methods and optimization mechanisms for SNNs remain underexplored. This study proposes two high-accuracy SNNs for object detection, AMS_YOLO and AMSpiking_VGG, which integrate neuronal variants and attention mechanisms. To strengthen the proposed networks, we explore the impact of incorporating different neuronal variants. The results show that optimizing the SNN's structure with neuronal variants outperforms optimizing the attention mechanism for object detection. Compared to the current state of the art in SNNs, AMS_YOLO improves accuracy by 6.7% on the static dataset COCO2017, and AMS_Spiking improves by 11.4% on the dynamic dataset GEN1.
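The event-driven computation the abstract refers to can be illustrated with a generic leaky integrate-and-fire (LIF) neuron, the standard building block that SNN neuronal variants extend. This is a minimal sketch for illustration only; it is not the paper's model, and the parameter values (`tau`, `v_threshold`, `v_reset`) are assumed defaults, not values from the paper.

```python
def lif_neuron(inputs, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.

    inputs: sequence of input currents, one per time step.
    Returns the binary spike train (1 = spike, 0 = silent).
    """
    v = v_reset
    spikes = []
    for x in inputs:
        # Leaky integration: the membrane potential decays toward the
        # reset value with time constant tau while the input drives it up.
        v = v + (x - (v - v_reset)) / tau
        if v >= v_threshold:
            spikes.append(1)  # Emit a spike and reset the membrane potential.
            v = v_reset
        else:
            spikes.append(0)  # Below threshold: no event, no downstream work.
    return spikes
```

Because downstream computation happens only at the `1` entries of the spike train, activity (and hence energy) scales with the number of events rather than the number of neurons, which is the efficiency property the abstract highlights.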

Original language: English
Article number: 9607
Journal: Applied Sciences (Switzerland)
Volume: 14
Issue number: 20
DOIs
Publication status: Published - Oct 2024

Keywords

  • attention mechanism
  • neuronal variants
  • object detection
  • spiking neural networks
