Visual Navigation Algorithms for Aircraft Fusing Neural Networks in Denial Environments

Yang Gao, Yue Wang*, Lingyun Tian, Dongguang Li, Fenming Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

A lightweight aircraft visual navigation algorithm that fuses neural networks is proposed to address the limited computing power of aircraft edge computing platforms operating offline in satellite-denied environments with complex working scenes. The algorithm uses an object detection network to label dynamic objects in complex scenes and eliminates the dynamic feature points they generate, which improves the quality of feature point extraction and thereby the navigation accuracy. The algorithm was validated on an aircraft edge computing platform and compared with existing methods in experiments on the TUM public dataset as well as in physical flight experiments. The results show that, while meeting the real-time requirements of the system, the proposed algorithm improves navigation accuracy and is more robust than the monocular ORB-SLAM2 method.
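The core idea described in the abstract, rejecting feature points that fall on detected dynamic objects before they reach the SLAM front end, can be illustrated with a minimal sketch. This is not the authors' implementation: the bounding-box format, the detector supplying the boxes, and the function name `filter_dynamic_keypoints` are assumptions made for illustration only.

```python
# Minimal sketch of dynamic feature point elimination (illustrative only).
# ORB keypoints lying inside bounding boxes of detected dynamic objects
# (assumed to come from a lightweight object detector) are discarded
# before being handed to the visual SLAM front end.
import cv2
import numpy as np


def filter_dynamic_keypoints(gray_frame, dynamic_boxes, n_features=1000):
    """Extract ORB features and drop those inside dynamic-object boxes.

    dynamic_boxes: iterable of (x_min, y_min, x_max, y_max) pixel boxes,
    assumed to be produced by an object detector that labels dynamic
    classes (pedestrians, vehicles, ...).
    """
    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints, descriptors = orb.detectAndCompute(gray_frame, None)
    if descriptors is None:
        return [], None

    # Keep a keypoint only if it lies outside every dynamic-object box.
    keep_mask = np.array([
        not any(x0 <= kp.pt[0] <= x1 and y0 <= kp.pt[1] <= y1
                for (x0, y0, x1, y1) in dynamic_boxes)
        for kp in keypoints
    ], dtype=bool)

    kept_keypoints = [kp for kp, keep in zip(keypoints, keep_mask) if keep]
    kept_descriptors = descriptors[keep_mask]
    return kept_keypoints, kept_descriptors
```

The remaining (presumably static) keypoints and descriptors would then be used for matching and pose estimation in place of the full, unfiltered set.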

Original language: English
Article number: 4797
Journal: Sensors
Volume: 24
Issue number: 15
DOIs
Publication status: Published - Aug 2024

Keywords

  • UAV
  • denial environment
  • visual navigation
