PFFNET: A Fast Progressive Feature Fusion Network for Detecting Drones in Infrared Images

Ziqiang Han, Cong Zhang*, Hengzhen Feng, Mingkai Yue, Kangnan Quan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The rampant misuse of drones poses a serious threat to national security and human life. Convolutional neural networks (CNNs) are currently widely used to detect drones. However, small drone targets in infrared images often exhibit reduced amplitude or even lost features, a problem that traditional CNNs cannot overcome. This paper proposes a Progressive Feature Fusion Network (PFFNET) and designs a Pooling Pyramid Fusion (PFM) module to provide more effective global contextual priors for the highest downsampling output. Then, a Feature Selection Model (FSM) is designed to improve the utilization of the output coding map and enhance the feature representation of the target in the network. Finally, a lightweight segmentation head is designed to achieve progressive feature fusion with multi-layer outputs. Experimental results show that the proposed algorithm achieves good real-time performance and high accuracy in drone target detection. On the public dataset, the intersection over union (IoU) is improved by 2.5% and the detection time is reduced by 81%.
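For concreteness, below is a minimal PyTorch sketch of how a pyramid-pooling fusion block of the kind described above could inject global contextual priors into the deepest (most downsampled) feature map. The pooling scales, channel widths, and the class name PyramidPoolingFusion are illustrative assumptions for this sketch and are not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPoolingFusion(nn.Module):
    """Pools the deepest feature map at several scales and fuses the resulting
    global context back with the input. Scales and widths are assumptions,
    not the paper's exact configuration."""

    def __init__(self, in_channels: int, pool_sizes=(1, 2, 3, 6)):
        super().__init__()
        branch_channels = in_channels // len(pool_sizes)
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.AdaptiveAvgPool2d(size),                      # regional/global context
                nn.Conv2d(in_channels, branch_channels, 1, bias=False),
                nn.BatchNorm2d(branch_channels),
                nn.ReLU(inplace=True),
            )
            for size in pool_sizes
        )
        fused_channels = in_channels + branch_channels * len(pool_sizes)
        self.project = nn.Sequential(
            nn.Conv2d(fused_channels, in_channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w = x.shape[-2:]
        # Upsample each pooled branch back to the input resolution and concatenate.
        context = [
            F.interpolate(branch(x), size=(h, w), mode="bilinear", align_corners=False)
            for branch in self.branches
        ]
        return self.project(torch.cat([x, *context], dim=1))

if __name__ == "__main__":
    # Example: fuse global context into a hypothetical deepest backbone output.
    deepest = torch.randn(1, 256, 16, 16)
    out = PyramidPoolingFusion(256)(deepest)
    print(out.shape)  # torch.Size([1, 256, 16, 16])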

Original language: English
Article number: 424
Journal: Drones
Volume: 7
Issue number: 7
DOIs
Publication status: Published - Jul 2023

Keywords

  • background clutter
  • counter-drones
  • lightweight network
  • progressive fusion
