Abstract
Given the payload limitations of unmanned aerial vehicles (UAVs), lightweight sensors such as cameras, inertial measurement units (IMUs), and GPS are ideal onboard measurement devices. Fusing multiple sensors yields accurate state estimation, and the resulting redundancy also provides robustness against sensor faults. However, scale estimation in visual systems (visual odometry or visual-inertial odometry, VO/VIO) suffers from sensor noise and degenerate motions such as uniform linear motion. Thus, in this paper, a scale-insensitive multi-sensor fusion (SIMSF) framework based on graph optimization is proposed. The framework combines the local estimate of the VO/VIO with global sensors to infer an accurate global state estimate of the UAV in real time. A similarity transformation between the local frame of the VO/VIO and the global frame is estimated by optimizing the poses of the most recent UAV states. In particular, for VO, an initial scale is estimated by aligning the VO with the IMU and GPS measurements. Moreover, a fault-detection method for the VO/VIO is proposed to enhance the robustness of the fusion framework. The proposed methods are tested on a UAV platform and evaluated in several challenging environments. A comparison between our results and those of other state-of-the-art algorithms demonstrates the superior accuracy, robustness, and real-time performance of our system. Our work is also a general fusion framework that can be extended to other platforms.
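The abstract describes estimating a similarity transformation (scale, rotation, translation) between the local VO/VIO frame and the global frame. The paper's graph-optimization formulation is not reproduced here; as a minimal illustrative sketch only, the closed-form Umeyama alignment below recovers such a transform from corresponding local-frame and global-frame (e.g. GPS) positions. The function name and the choice of Umeyama alignment are assumptions for illustration, not the paper's method.

```python
import numpy as np

def umeyama_sim3(src, dst):
    """Estimate a similarity transform (s, R, t) so that dst ~= s * R @ src_i + t.

    src, dst: (N, 3) arrays of corresponding 3-D positions, e.g. local
    VO/VIO positions (src) and GPS positions in the global frame (dst).
    Returns scale s (float), rotation R (3x3), translation t (3,).
    """
    mu_src = src.mean(axis=0)
    mu_dst = dst.mean(axis=0)
    src_c = src - mu_src
    dst_c = dst - mu_dst
    n = src.shape[0]
    # Cross-covariance between the centred point sets.
    cov = dst_c.T @ src_c / n
    U, D, Vt = np.linalg.svd(cov)
    # Guard against a reflection in the least-squares solution.
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / n
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t
```

In a fusion pipeline, an alignment like this could serve as the initial guess for the scale before the graph optimizer refines the full trajectory.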
Original language | English |
---|---|
Article number | 9110577 |
Pages (from-to) | 118273-118284 |
Number of pages | 12 |
Journal | IEEE Access |
Volume | 8 |
DOIs | |
Publication status | Published - 2020 |
Externally published | Yes |
Keywords
- Multi-sensor fusion
- fusion framework
- graph optimization
- scale insensitive
- state estimation
- unmanned aerial vehicle