TY - JOUR
T1 - Multi-Agent Visual-Inertial Localization for Integrated Aerial Systems with Loose Fusion of Odometry and Kinematics
AU - Lai, Ganghua
AU - Shi, Chuanbeibei
AU - Wang, Kaidi
AU - Yu, Yushu
AU - Dong, Yiqun
AU - Franchi, Antonio
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2024/7/1
Y1 - 2024/7/1
N2 - Reliably and efficiently estimating the relative pose and global localization of robots in a common reference frame for Integrated Aerial Platforms (IAPs) is a challenging problem. Unlike unmanned aerial vehicle (UAV) swarms, where each individual agent can move freely, IAPs connect UAV agents with mechanical joints, such as spherical joints, to form a rigid central platform, limiting the degrees of freedom (DOF) of the agents. Traditional methods, which rely on forming loop closures, object detection, or range sensors, suffer from degeneration or inefficiency due to the restricted relative motion between agents. In this paper, we present a centralized multi-agent localization system that fuses the internal kinematic constraints of IAPs with odometry measurements, using only visual-inertial suites for ego-motion estimation of the agents and an additional 9-DOF Inertial Measurement Unit (IMU) attached to the central platform for posture estimation. A general formulation of the kinematic constraints is derived without requiring knowledge of detailed kinematic parameters. A sliding-window optimization-based state estimator is constructed to estimate the relative transformation between agents. Our proposed approach is validated on our collected dataset. The results show that the proposed method reduces the global localization drift by 27.15% and the relative localization error by 53.4% in translation and 36.99% in rotation compared to the baseline.
AB - Reliably and efficiently estimating the relative pose and global localization of robots in a common reference frame for Integrated Aerial Platforms (IAPs) is a challenging problem. Unlike unmanned aerial vehicle (UAV) swarms, where each individual agent can move freely, IAPs connect UAV agents with mechanical joints, such as spherical joints, to form a rigid central platform, limiting the degrees of freedom (DOF) of the agents. Traditional methods, which rely on forming loop closures, object detection, or range sensors, suffer from degeneration or inefficiency due to the restricted relative motion between agents. In this paper, we present a centralized multi-agent localization system that fuses the internal kinematic constraints of IAPs with odometry measurements, using only visual-inertial suites for ego-motion estimation of the agents and an additional 9-DOF Inertial Measurement Unit (IMU) attached to the central platform for posture estimation. A general formulation of the kinematic constraints is derived without requiring knowledge of detailed kinematic parameters. A sliding-window optimization-based state estimator is constructed to estimate the relative transformation between agents. Our proposed approach is validated on our collected dataset. The results show that the proposed method reduces the global localization drift by 27.15% and the relative localization error by 53.4% in translation and 36.99% in rotation compared to the baseline.
KW - Aerial Systems: applications
KW - localization
KW - multi-robot SLAM
UR - http://www.scopus.com/inward/record.url?scp=85194852704&partnerID=8YFLogxK
U2 - 10.1109/LRA.2024.3407579
DO - 10.1109/LRA.2024.3407579
M3 - Article
AN - SCOPUS:85194852704
SN - 2377-3766
VL - 9
SP - 6504
EP - 6511
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 7
ER -