TY - JOUR
T1 - Inertial-visual Collaborative Navigation Method for Master-slave Multi-lunar-based Equipment
AU - Lu, Siqi
AU - Ge, Dantong
AU - Xu, Rui
AU - Zhu, Shengying
N1 - Publisher Copyright:
Copyright © 2023 by the International Astronautical Federation (IAF). All rights reserved.
PY - 2023
Y1 - 2023
N2 - This paper presents an inertial-visual collaborative navigation method for master-slave multi-lunar-based equipment. First, in view of the large uncertainty of the lunar environment, a multi-machine collaborative SLAM localization technique is proposed, which enables the lunar-based equipment to build an environmental model and estimate its own motion without prior environmental information. Second, a global image operator is introduced to identify the common-view area of the multi-lunar-based equipment, establish information association among the equipment, and improve the robustness of the visual navigation system to environmental changes such as illumination. Finally, motion-state information is exchanged between the equipment by radio communication, and the measurements are filtered and fused with the information obtained from the inertial navigation system. A joint filtering model is used to simulate the information fusion process of the lunar-based equipment according to the weight of the allocation factor, which makes the target positioning more precise. The simulation results show that the proposed algorithm can further restrain the divergence of positioning errors and improve the positioning accuracy of the lunar-based equipment, verifying its effectiveness.
AB - This paper presents an inertial-visual collaborative navigation method for master-slave multi-lunar-based equipment. First, in view of the large uncertainty of the lunar environment, a multi-machine collaborative SLAM localization technique is proposed, which enables the lunar-based equipment to build an environmental model and estimate its own motion without prior environmental information. Second, a global image operator is introduced to identify the common-view area of the multi-lunar-based equipment, establish information association among the equipment, and improve the robustness of the visual navigation system to environmental changes such as illumination. Finally, motion-state information is exchanged between the equipment by radio communication, and the measurements are filtered and fused with the information obtained from the inertial navigation system. A joint filtering model is used to simulate the information fusion process of the lunar-based equipment according to the weight of the allocation factor, which makes the target positioning more precise. The simulation results show that the proposed algorithm can further restrain the divergence of positioning errors and improve the positioning accuracy of the lunar-based equipment, verifying its effectiveness.
KW - Collaborative navigation
KW - Federated filter
KW - Lunar-based equipment
KW - SLAM
UR - http://www.scopus.com/inward/record.url?scp=85187997035&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85187997035
SN - 0074-1795
VL - 2023-October
JO - Proceedings of the International Astronautical Congress, IAC
JF - Proceedings of the International Astronautical Congress, IAC
T2 - 74th International Astronautical Congress, IAC 2023
Y2 - 2 October 2023 through 6 October 2023
ER -