AeroNet: An efficient relative localization and object detection network for cooperative aerial-ground unmanned vehicles

Kai Shen*, Yu Zhuang, Yixuan Chen, Siqi Zuo, Tong Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

This paper proposes an efficient relative localization and object detection network (AeroNet) based on incremental learning for minimalistic high-speed cooperative navigation of aerial-ground unmanned vehicles in cluttered environments. Because micro-UAVs have highly limited computational capability and memory resources, the YOLO series is adopted as the baseline object detection network, and a lightweight backbone is built on depthwise separable convolutions. To improve real-time performance, the detection head is formulated with a broad learning system. In addition, 6D relative pose estimation is achieved by fitting the equation of an elliptical cooperative mark. To verify the effectiveness of AeroNet, experiments are conducted on an Intel NUC and an NVIDIA TX2 with our self-collected dataset. The results show that AeroNet progressively increases object detection accuracy to 89%, with a computational time of only 76 ms on the Intel NUC and 28 ms on the NVIDIA TX2, which meets the real-time requirement of on-board computation in micro-UAV avionics systems.
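The abstract describes the lightweight backbone only at the level of depthwise separable convolution. The following PyTorch sketch shows the standard depthwise-plus-pointwise factorization such a backbone would be assembled from; the class name, channel widths, and activation choice are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 convolution followed by a pointwise 1x1 convolution."""
    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        # Depthwise: one 3x3 filter per input channel (groups=in_channels).
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                                   stride=stride, padding=1,
                                   groups=in_channels, bias=False)
        self.bn1 = nn.BatchNorm2d(in_channels)
        # Pointwise: 1x1 convolution mixes information across channels.
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                   bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.bn1(self.depthwise(x)))
        x = self.act(self.bn2(self.pointwise(x)))
        return x

if __name__ == "__main__":
    # Example: one block applied to a 640x640 RGB frame.
    block = DepthwiseSeparableConv(3, 32, stride=2)
    out = block(torch.randn(1, 3, 640, 640))
    print(out.shape)  # torch.Size([1, 32, 320, 320])
```

Compared with a standard 3x3 convolution, this factorization cuts parameters and multiply-accumulate operations roughly by a factor of the output channel count, which is what makes it attractive for the memory- and compute-constrained micro-UAV setting the abstract describes.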
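The abstract states that relative pose is obtained by fitting the equation of an elliptical cooperative mark, but gives no further detail. The sketch below is a minimal OpenCV illustration of that idea under weak-perspective assumptions, recovering only range and bearing from the fitted ellipse; the function name, mark radius, and camera intrinsics are hypothetical, and the paper's full 6D solution is not reproduced here.

```python
import cv2
import numpy as np

# Illustrative constants: physical radius of the circular cooperative
# mark (metres) and pinhole camera intrinsics (pixels).
MARK_RADIUS_M = 0.20
FX = 600.0
CX, CY = 320.0, 240.0

def localize_mark(binary_mask):
    """Fit an ellipse to the segmented cooperative mark and estimate the
    mark centre's position in the camera frame. A sketch only, not the
    paper's full 6D pose estimator."""
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)
    if len(contour) < 5:  # cv2.fitEllipse needs at least 5 points
        return None
    (u, v), (ax1, ax2), angle = cv2.fitEllipse(contour)
    major_axis = max(ax1, ax2)
    # Under weak perspective the major axis preserves the circle diameter,
    # so depth follows from similar triangles: Z ~ f * D / d_pixels.
    depth = FX * (2.0 * MARK_RADIUS_M) / major_axis
    # Back-project the ellipse centre to get lateral offsets.
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FX
    return np.array([x, y, depth]), angle
```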

Original language: English
Pages (from-to): 28-37
Number of pages: 10
Journal: Pattern Recognition Letters
Volume: 171
DOIs
Publication status: Published - Jul 2023
Externally published: Yes

Keywords

  • Cooperative navigation
  • Incremental learning
  • Object detection
  • Relative localization
