Airborne Vision Based Target Motion State Estimation for UAV Aerial Docking

Ruoxuan Li*, Shaoming He, Tao Song, Hong Tao

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper proposes a new target motion state estimation method for UAV aerial refueling and autonomous recovery. First, measurements from airborne vision, the airborne electro-optical pod, and GPS are fused to formulate a feasible target estimation model. The problems of unexpected outliers and inconsistent update frequencies across these different measurements are alleviated by a multi-model, multi-rate extended Kalman filter algorithm, which autonomously and optimally selects and updates the measurement model. Numerous flight experiments demonstrate that the proposed target estimation method achieves satisfactory accuracy and high robustness in complex interference environments.
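The abstract only sketches the filtering idea, so the snippet below illustrates the multi-rate update pattern it describes: each sensor has its own measurement model and firing rate, and a chi-square innovation gate rejects outlier measurements. This is a minimal generic sketch in Python/NumPy, not the paper's implementation; the constant-velocity state model, the position-only measurement models, the sensor rates, the noise levels, and the `MultiRateEKF` class are all illustrative assumptions.

```python
import numpy as np

# Minimal multi-rate EKF sketch: constant-velocity planar target, three
# position sensors with different noise levels and update rates.
# Hypothetical models for illustration only; not the paper's algorithm.

class MultiRateEKF:
    def __init__(self, x0, P0, q):
        self.x = np.asarray(x0, dtype=float)  # state: [px, py, vx, vy]
        self.P = np.asarray(P0, dtype=float)  # state covariance
        self.q = q                            # process-noise intensity

    def predict(self, dt):
        # Constant-velocity process model with white-acceleration noise
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        Q = self.q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, H, R, gate=9.21):
        # Linear Kalman update; gate = chi-square 99% threshold, 2 dof
        y = z - H @ self.x                    # innovation
        S = H @ self.P @ H.T + R              # innovation covariance
        if y @ np.linalg.solve(S, y) > gate:
            return False                      # outlier: reject measurement
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P
        return True

# All three sensors observe position here (a simplification); in practice
# vision and pod measurements would be bearing-type with nonlinear models.
H_pos = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])
sensors = {                                   # illustrative rates and noise
    "vision": dict(H=H_pos, R=np.diag([0.5, 0.5]), period=0.05),
    "eo_pod": dict(H=H_pos, R=np.diag([1.0, 1.0]), period=0.10),
    "gps":    dict(H=H_pos, R=np.diag([4.0, 4.0]), period=1.00),
}

ekf = MultiRateEKF(x0=[0, 0, 5, 0], P0=np.eye(4) * 10.0, q=0.1)
rng = np.random.default_rng(0)
dt = 0.05
truth = np.array([0.0, 0.0, 5.0, 0.0])

for k in range(100):                          # 5 s of simulated flight
    truth[:2] += truth[2:] * dt
    ekf.predict(dt)
    for s in sensors.values():
        if k % round(s["period"] / dt) == 0:  # fire each sensor at its own rate
            z = s["H"] @ truth + rng.multivariate_normal([0, 0], s["R"])
            ekf.update(z, s["H"], s["R"])

print("estimate:", np.round(ekf.x, 2), "truth:", truth)
```

The key design point the sketch captures is that prediction runs at the fastest tick while each measurement model is applied only when its sensor actually reports, so slow GPS fixes and fast vision frames coexist in one filter; the gating step stands in for the paper's outlier handling.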

Original language: English
Title of host publication: Advances in Guidance, Navigation and Control - Proceedings of 2022 International Conference on Guidance, Navigation and Control
Editors: Liang Yan, Haibin Duan, Yimin Deng
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 4384-4395
Number of pages: 12
ISBN (Print): 9789811966125
Publication status: Published - 2023
Event: International Conference on Guidance, Navigation and Control, ICGNC 2022 - Harbin, China
Duration: 5 Aug 2022 – 7 Aug 2022

Publication series

Name: Lecture Notes in Electrical Engineering
Volume: 845 LNEE
ISSN (Print): 1876-1100
ISSN (Electronic): 1876-1119

Conference

Conference: International Conference on Guidance, Navigation and Control, ICGNC 2022
Country/Territory: China
City: Harbin
Period: 5/08/22 – 7/08/22

Keywords

  • Aerial docking
  • Airborne vision
  • GPS
  • Multi-model multi-rate Kalman filter
  • Target state estimation
