An Integrated Navigation Method Based on the Strapdown Inertial Navigation System/Scene-Matching Navigation System for UAVs

Yukun Wang, Qiang Wang*, Zhonghu Hao, Puhua Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

To address the challenges of discontinuous heterogeneous image matching, significant matching errors in specific regions, and poor real-time performance in GNSS-denied environments for unmanned aerial vehicles (UAVs), we propose an integrated navigation method based on the strapdown inertial navigation system (SINS)/scene-matching navigation system (SMNS). First, we designed a heterogeneous image-matching and positioning approach using infrared images to obtain an estimate of the UAV’s position. Then, we established a mathematical model for the integrated SINS/SMNS navigation system. Finally, a Kalman filter (KF) was employed to fuse the inertial navigation data with absolute position data from scene matching, achieving high-precision and highly reliable navigation positioning. We constructed a navigation data acquisition platform and conducted simulation studies using flight data collected from this platform. The results demonstrate that the integrated SINS/SMNS navigation method significantly outperforms standalone scene-matching navigation in horizontal positioning accuracy, improving latitude accuracy by 52.34% and longitude accuracy by 45.54%.
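For illustration, the sketch below (Python with NumPy) shows the kind of loosely coupled fusion step the abstract describes: a Kalman filter propagates a small error state between fixes and corrects the SINS-indicated position whenever a scene-matching position becomes available. The error-state layout, noise values, and function names are assumptions made for clarity, not the paper's exact model.

    import numpy as np

    # Assumed 4-state error model [dN, dE, dVN, dVE]: position and velocity
    # errors in a local-level frame (metres, metres per second).
    dt = 0.1                      # filter period in seconds (assumed)
    F = np.array([[1, 0, dt, 0],  # position error integrates velocity error
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = np.diag([0.01, 0.01, 0.1, 0.1])   # process noise (assumed values)
    H = np.array([[1., 0., 0., 0.],       # scene matching observes position error only
                  [0., 1., 0., 0.]])
    R = np.diag([25.0, 25.0])             # scene-match position noise, m^2 (assumed)

    x = np.zeros(4)               # error-state estimate
    P = np.eye(4) * 100.0         # initial covariance

    def predict():
        """Propagate the error state and covariance between scene-match fixes."""
        global x, P
        x = F @ x
        P = F @ P @ F.T + Q

    def update(sins_pos, smns_pos):
        """Fuse a scene-matching position fix with the SINS-indicated position.

        The measurement is the difference between the SINS position and the
        absolute position from scene matching (both 2-vectors in metres).
        """
        global x, P
        z = sins_pos - smns_pos            # observed position error
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P

    # Corrected navigation output: subtract the estimated position error
    # from the SINS solution, e.g. corrected_pos = sins_pos - x[:2]

In this loosely coupled arrangement the scene-matching fix acts as an aperiodic absolute measurement, so the filter simply skips the update step whenever no match is available and relies on the prediction alone.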

Original language: English
Article number: 3379
Journal: Sensors
Volume: 25
Issue number: 11
DOIs
Publication status: Published - Jun 2025
Externally published: Yes

Keywords

  • integrated navigation system
  • KF
  • SINS
  • SMNS
  • UAV
