Abstract
Multi-robot simultaneous localization and mapping (MR-SLAM) is of great importance for enhancing the efficiency of large-scale environment exploration. Despite remarkable advances in cooperation schemes, approaches that handle the multiple uncertainties inherent to MR-SLAM in large-scale environments remain scarce. This paper proposes a multi-uncertainty captured multi-robot lidar odometry and mapping (MUC-LOAM) framework to quantify and utilize the uncertainties of feature points and robot mutual poses in large-scale environments. A hybrid weighting strategy for pose updates is integrated into MUC-LOAM to handle feature uncertainty arising from distance changes and dynamic objects. A Bayesian Neural Network (BNN) is devised to capture mutual pose uncertainty, and covariance propagation through the quaternion-to-Euler-angle conversion is then leveraged to filter out unreliable mutual poses. A further covariance propagation through coordinate transformations in nonlinear optimization improves the accuracy of map merging. The feasibility and enhanced robustness of the proposed framework for large-scale exploration are validated on both public datasets and real-world experiments.
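The abstract mentions propagating covariance through the quaternion-to-Euler-angle conversion to reject unreliable mutual poses. Below is a minimal sketch of such first-order propagation, assuming a ZYX (roll-pitch-yaw) Euler convention and a numerically approximated Jacobian; the function names and example covariance values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def quat_to_euler(q):
    """Convert a unit quaternion (w, x, y, z) to roll-pitch-yaw angles (ZYX convention)."""
    w, x, y, z = q
    roll  = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    yaw   = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return np.array([roll, pitch, yaw])

def propagate_covariance(q, cov_q, eps=1e-6):
    """First-order propagation Sigma_e = J * Sigma_q * J^T,
    with the 3x4 Jacobian J approximated by central finite differences."""
    J = np.zeros((3, 4))
    for i in range(4):
        dq = np.zeros(4)
        dq[i] = eps
        J[:, i] = (quat_to_euler(q + dq) - quat_to_euler(q - dq)) / (2 * eps)
    return J @ cov_q @ J.T

# Example: a mutual-pose rotation estimate with a hypothetical per-component covariance
q = np.array([0.99, 0.02, 0.03, 0.10])
q = q / np.linalg.norm(q)                   # renormalize to a unit quaternion
cov_q = np.diag([1e-4, 2e-4, 2e-4, 5e-4])   # assumed diagonal quaternion covariance
cov_euler = propagate_covariance(q, cov_q)
print(np.sqrt(np.diag(cov_euler)))          # 1-sigma uncertainty per Euler angle
```

In a filtering scheme like the one described, a mutual pose whose propagated Euler-angle variances exceed a chosen threshold could be treated as unreliable and discarded before map merging.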
| Original language | English |
| --- | --- |
| Pages (from-to) | 143-157 |
| Number of pages | 15 |
| Journal | Unmanned Systems |
| Volume | 11 |
| Issue number | 2 |
| DOI | |
| Publication status | Published - 1 Apr 2023 |