Cooperative Visual-Range-Inertial Navigation for Multiple Unmanned Aerial Vehicles

Chunyu Li, Jianan Wang*, Junhui Liu, Jiayuan Shan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

In this article, the cooperative navigation problem is investigated for a group of unmanned aerial vehicles (UAVs). A distributed estimation architecture that fuses range, visual, and intermittent position measurements is proposed. Relative ranges and co-observed features are used to construct direct and indirect geometric constraints between UAVs, respectively. Compared with its independent counterpart, the proposed collaborative estimation scheme is more accurate and robust, while remaining scalable and efficient in practical deployment. To circumvent the intractable problem of evaluating the cross-covariance between local estimators, the covariance intersection (CI) algorithm is introduced into the distributed fusion scheme, so that each UAV estimates only its own pose and covariance. An observability analysis is provided to gain insight into the system's identifiability properties. Finally, the algorithm is applied to a practical multi-UAV patrolling scenario, and both numerical and software-in-the-loop (SITL) simulations are performed to demonstrate the feasibility and effectiveness of the proposed scheme.
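The covariance intersection fusion referenced in the abstract can be illustrated with a minimal sketch. The code below is not the paper's estimator; it only shows the standard CI rule for combining two estimates with unknown cross-covariance, with the weight chosen by a simple trace-minimizing grid search (the function name, grid size, and example values are assumptions for illustration).

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=50):
    """Fuse two estimates (x1, P1) and (x2, P2) whose cross-covariance is unknown.

    The weight omega is selected here to minimize the trace of the fused
    covariance, one common (but not the only) criterion.
    """
    best = None
    for omega in np.linspace(1e-3, 1 - 1e-3, n_grid):
        # Information-form combination: P^{-1} = w * P1^{-1} + (1 - w) * P2^{-1}
        info = omega * np.linalg.inv(P1) + (1 - omega) * np.linalg.inv(P2)
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            x = P @ (omega * np.linalg.inv(P1) @ x1
                     + (1 - omega) * np.linalg.inv(P2) @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Example: fuse two 2-D position estimates held by neighboring UAVs.
x_a, P_a = np.array([1.0, 2.0]), np.diag([0.5, 0.2])
x_b, P_b = np.array([1.2, 1.9]), np.diag([0.3, 0.4])
x_ci, P_ci = covariance_intersection(x_a, P_a, x_b, P_b)
```

The appeal of CI in a distributed setting is that the fused covariance is guaranteed to be consistent (not overconfident) for any cross-correlation between the inputs, which is why it suits the scheme described above, where each UAV tracks only its own pose and covariance.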

Original language: English
Pages (from-to): 7851-7865
Number of pages: 15
Journal: IEEE Transactions on Aerospace and Electronic Systems
Volume: 59
Issue number: 6
DOIs
Publication status: Published - 1 Dec 2023

Keywords

  • Aerial systems
  • cooperative localization
  • multi unmanned aerial vehicle (UAV) systems
  • ultra-wideband (UWB)
  • visual-inertial navigation
