Event-Triggered Asynchronous Fusion Localization for an Autonomous Mobile Robot With a Camera and UWB in Confined Indoor Spaces

  • Xinyue Cao
  • Hongjiu Yang*
  • Yuanqing Xia
  • Li Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Accurate indoor localization for autonomous mobile robots using a camera and ultra-wideband (UWB) remains challenging in confined indoor spaces due to asynchronous sampling, heavy-tailed noise, and missing measurements. In this article, event-triggered asynchronous fusion localization is investigated for an autonomous mobile robot using a camera and a UWB sensor in confined indoor spaces. A dynamic event-triggered mechanism is proposed to save communication resources by reducing measurement transmissions. A compensation strategy is designed to recover non-triggered measurements for local estimation. A UWB local estimator is designed to address heavy-tailed noise and uncertainties via measurement compensation. A visual local estimator is designed under missing-measurement compensation using fusion-estimate feedback. A distributed fusion estimator is proposed to obtain fusion estimates through a matrix-weighted fusion method. The effectiveness of the proposed event-triggered asynchronous fusion localization is demonstrated through experimental results in an indoor narrow corridor.
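The abstract names two standard building blocks: a dynamic event-triggered transmission test and matrix-weighted fusion of local estimates. The paper's exact trigger condition, dynamic-variable update, and weight derivation are not given here, so the sketch below uses common textbook forms as an illustration: a trigger softened by an auxiliary variable `eta`, and minimum-variance matrix weights for two uncorrelated local estimates. All function names and parameter values are hypothetical.

```python
import numpy as np

def dynamic_event_trigger(y_new, y_last, eta, theta=0.5, lam=0.8, beta=2.0):
    """Illustrative dynamic event-triggered transmission test (not the
    paper's exact condition): transmit when the squared measurement
    deviation exceeds a threshold relaxed by the dynamic variable eta."""
    err = float(np.sum((y_new - y_last) ** 2))
    triggered = err > theta + eta / beta
    # A common dynamic-variable update in the ETM literature; clipped at 0.
    eta_next = max(lam * eta + theta - err, 0.0)
    return triggered, eta_next

def matrix_weighted_fusion(x1, P1, x2, P2):
    """Fuse two uncorrelated local estimates with matrix weights
    W1 = P2 (P1 + P2)^-1 and W2 = P1 (P1 + P2)^-1 (so W1 + W2 = I),
    which minimize the fused error covariance in this setting."""
    S = np.linalg.inv(P1 + P2)
    x = P2 @ S @ x1 + P1 @ S @ x2       # fused state estimate
    P = P1 @ S @ P2                     # fused error covariance
    return x, P

# Example: fuse a UWB-like and a camera-like 2-D position estimate.
x_uwb, P_uwb = np.array([1.0, 0.0]), np.diag([0.04, 0.09])
x_cam, P_cam = np.array([1.2, 0.1]), np.diag([0.09, 0.04])
x_fused, P_fused = matrix_weighted_fusion(x_uwb, P_uwb, x_cam, P_cam)
```

The fused covariance `P1 (P1 + P2)^-1 P2` is smaller (in trace) than either local covariance, which is the usual motivation for distributed matrix-weighted fusion over trusting a single sensor.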

Original language: English
Journal: IEEE Transactions on Industrial Electronics
Publication status: Accepted/In press - 2026
Externally published: Yes

Keywords

  • Asynchronous fusion
  • confined spaces
  • event-triggered mechanism
  • indoor localization
  • mobile robots

