TY - JOUR
T1 - LM-VINS
T2 - A Robust and Computationally Efficient Visual-Inertial Navigation System for Loitering Munitions
AU - Gao, Zhiming
AU - Jiang, Jiaqi
AU - Li, Chunyu
AU - Liu, Junhui
AU - Wang, Jianan
AU - Shan, Jiayuan
N1 - Publisher Copyright:
© 1965-2011 IEEE.
PY - 2025
Y1 - 2025
N2 - Visual-inertial navigation systems (VINS) are crucial for robot navigation in GPS-denied environments. However, most existing methods are developed for general-purpose robotics and are unsuitable for high-speed, resource-constrained loitering munitions. To address these limitations, we propose LM-VINS, a robust and efficient visual-inertial navigation system for high-speed loitering munitions. First, a hybrid feature-tracking strategy balances accuracy and efficiency by applying learning-based feature tracking to keyframes and optical-flow tracking to non-keyframes. Then, a bio-inspired dual-memory system performs place recognition, mitigating long-term drift through reward-driven priority ranking and generative replay. Finally, these two components are integrated into a tightly coupled optimization framework to refine the pose estimate of the loitering munition. To evaluate the proposed approach, we construct LM-1600, the first visual-inertial dataset specifically for loitering munitions. Experimental results show that the system achieves an absolute trajectory error (ATE) of 11.971 m, a 31.7% improvement over AirVO, while using only 40% of an Intel Core i9-13900H CPU. The proposed VINS framework paves the way for enhancing the autonomy and reliability of loitering munitions in GPS-denied environments and enabling more robust operation in complex and dynamic scenarios.
AB - Visual-inertial navigation systems (VINS) are crucial for robot navigation in GPS-denied environments. However, most existing methods are developed for general-purpose robotics and are unsuitable for high-speed, resource-constrained loitering munitions. To address these limitations, we propose LM-VINS, a robust and efficient visual-inertial navigation system for high-speed loitering munitions. First, a hybrid feature-tracking strategy balances accuracy and efficiency by applying learning-based feature tracking to keyframes and optical-flow tracking to non-keyframes. Then, a bio-inspired dual-memory system performs place recognition, mitigating long-term drift through reward-driven priority ranking and generative replay. Finally, these two components are integrated into a tightly coupled optimization framework to refine the pose estimate of the loitering munition. To evaluate the proposed approach, we construct LM-1600, the first visual-inertial dataset specifically for loitering munitions. Experimental results show that the system achieves an absolute trajectory error (ATE) of 11.971 m, a 31.7% improvement over AirVO, while using only 40% of an Intel Core i9-13900H CPU. The proposed VINS framework paves the way for enhancing the autonomy and reliability of loitering munitions in GPS-denied environments and enabling more robust operation in complex and dynamic scenarios.
KW - bio-inspired place recognition
KW - feature tracking
KW - large-scale long-term scenarios
KW - loitering munition
KW - visual-inertial navigation
UR - https://www.scopus.com/pages/publications/105022725276
U2 - 10.1109/TAES.2025.3635112
DO - 10.1109/TAES.2025.3635112
M3 - Article
AN - SCOPUS:105022725276
SN - 0018-9251
JO - IEEE Transactions on Aerospace and Electronic Systems
JF - IEEE Transactions on Aerospace and Electronic Systems
ER -