Absolute pose estimation of UAV based on large-scale satellite image

Hanyu WANG, Qiang SHEN*, Zilong DENG, Xinyi CAO, Xiaokang WANG

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Obtaining the absolute pose from pre-loaded satellite images is an important means of autonomous navigation for small Unmanned Aerial Vehicles (UAVs) in Global Navigation Satellite System (GNSS) denied environments. Most previous works build Convolutional Neural Networks (CNNs) to extract features and then directly regress the pose, an approach that fails under the large viewpoint and scale differences between “UAV-satellite” image pairs in real-world scenarios. Therefore, this paper proposes a probability distribution/regression integrated deep model with an attention-guided triple fusion mechanism, which estimates discrete distributions in pose space and three-dimensional vectors in translation space. To overcome the shortage of relevant datasets, this paper simulates image datasets captured by UAVs with forward-facing cameras during target searching and autonomous attack. The effectiveness, superiority, and robustness of the proposed method are verified on simulated datasets and in flight tests.
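
To make the distribution/regression integrated design concrete, below is a minimal PyTorch sketch of a two-head pose network: one head predicts a discrete distribution over a binned orientation space, the other regresses the three-dimensional translation vector. The ResNet-18 backbone, the bin count, the loss weighting, and the use of a plain categorical head in place of the paper's von Mises-Fisher formulation and attention-guided triple fusion are all illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not the authors' code): a pose network coupling a
# probabilistic orientation head with a translation regression head.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class PoseHeadNet(nn.Module):
    def __init__(self, num_orientation_bins=64):
        super().__init__()
        # Shared CNN backbone; ResNet-18 is a stand-in for the paper's
        # attention-guided fusion backbone.
        backbone = models.resnet18(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        feat_dim = backbone.fc.in_features  # 512 for ResNet-18

        # Head 1: discrete distribution over a binned orientation space.
        self.orientation_head = nn.Linear(feat_dim, num_orientation_bins)
        # Head 2: direct regression of the 3-D translation vector.
        self.translation_head = nn.Linear(feat_dim, 3)

    def forward(self, x):
        f = self.features(x).flatten(1)          # (B, feat_dim)
        orientation_logits = self.orientation_head(f)  # (B, num_bins)
        translation = self.translation_head(f)         # (B, 3)
        return orientation_logits, translation

def pose_loss(orientation_logits, orientation_target,
              translation_pred, translation_target, beta=1.0):
    # Cross-entropy over the discretized orientation distribution plus a
    # robust translation term; the weight beta is an illustrative assumption.
    rot_loss = F.cross_entropy(orientation_logits, orientation_target)
    trans_loss = F.smooth_l1_loss(translation_pred, translation_target)
    return rot_loss + beta * trans_loss

In such a setup, training would minimize the combined loss over “UAV-satellite” image pairs, and at inference the orientation estimate could be taken as the mode (or expectation) of the predicted distribution while the translation is read directly from the regression head.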

Original language: English
Pages (from-to): 219-231
Number of pages: 13
Journal: Chinese Journal of Aeronautics
Volume: 37
Issue number: 6
DOIs
Publication status: Published - Jun 2024

Keywords

  • Deep neural networks
  • Satellite imagery
  • Unmanned Aerial Vehicle (UAV)
  • Vision navigation
  • von Mises-Fisher distribution
