A Visual Feature Mismatch Detection Algorithm for Optical Flow-Based Visual Odometry

Ruichen Li, Han Shen*, Linan Wang, Congyi Liu, Xiaojian Yi

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Camera-based visual simultaneous localization and mapping (VSLAM) algorithms extract and track feature points in their front-ends; the matched feature points are then forwarded to the back-end for camera pose estimation. However, feature matches produced by optical flow tracking are prone to visual feature mismatches. To address this problem, this paper introduces a novel visual feature mismatch detection algorithm. First, the algorithm computes the pixel displacement of every feature point pair tracked by optical flow between consecutive images. Mismatches are then detected against a pixel displacement threshold derived from the statistical characteristics of the tracking results. In addition, bound values are imposed on the threshold to improve the accuracy of the filtered matches and ensure adaptability to different environments. From the filtered matches, the algorithm estimates the fundamental matrix, which is used to further refine the matches before they are sent to the back-end for camera pose estimation. The algorithm is seamlessly integrated into a state-of-the-art VSLAM system, enhancing its overall robustness. Extensive experiments on both public datasets and our unmanned surface vehicles (USVs) validate the performance of the proposed algorithm.
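The displacement-threshold stage described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the mean-plus-k-sigma rule, and the specific bound constants are all assumptions; the abstract only specifies that the threshold is computed from the statistics of the tracking results and clamped by bound values, after which the surviving matches feed a fundamental-matrix refinement (e.g. RANSAC).

```python
import numpy as np

def filter_matches(pts_prev, pts_curr, k=2.0, lower=1.0, upper=30.0):
    """Hypothetical sketch of the displacement-threshold mismatch test.

    pts_prev, pts_curr: (N, 2) arrays of matched feature coordinates in two
    consecutive frames (e.g. tracked by pyramidal Lucas-Kanade optical flow).
    The threshold mean + k*std is clamped to [lower, upper] pixels, mirroring
    the paper's bounded adaptive threshold; k, lower, upper are assumed values.
    """
    # Per-pair pixel displacement between consecutive images.
    disp = np.linalg.norm(pts_curr - pts_prev, axis=1)
    # Threshold from the statistical characteristics of the tracking results.
    thresh = disp.mean() + k * disp.std()
    # Bound values keep the threshold sane in degenerate scenes.
    thresh = float(np.clip(thresh, lower, upper))
    # Inlier mask; survivors would go on to fundamental-matrix refinement.
    keep = disp <= thresh
    return keep, thresh
```

A match whose displacement far exceeds the bulk of the tracking results (a gross optical-flow mismatch) is rejected before pose estimation, while the clamp prevents a few large outliers from inflating the threshold enough to let each other through.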

Original language: English
Journal: Unmanned Systems
DOIs
Publication status: Accepted/In press - 2024

Keywords

  • Mismatch detection
  • Optical flow
  • Visual odometry (VO)
  • Visual simultaneous localization and mapping (VSLAM)

