Adaptive Covariance Matrix based on Blur Evaluation for Visual-Inertial Navigation

Yi Fan Zuo, Changda Yan, Qiwei Liu, Xia Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The covariance matrix in current mainstream visual-inertial navigation systems is set manually, so the weight of the visual information cannot be adjusted to different degrees of blur, which degrades the accuracy and robustness of the whole system. To solve this problem, this paper proposes a navigation scheme based on an adaptive covariance matrix. The method uses the Laplacian operator to score the blur degree of each image; the visual covariance matrix is then adjusted according to that score, so that the weight of the visual measurements in the fusion system follows the image quality. In this way, the algorithm improves the accuracy of the system. Simulation results show that the proposed method effectively improves system accuracy, and compared with the traditional method, the proposed algorithm is more robust when motion blur occurs.
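As a minimal sketch of the idea described in the abstract (not the authors' implementation), the Python snippet below scores image blur with the variance of the Laplacian, a standard focus measure, and inflates a visual measurement covariance when the score is low. The scaling rule and the constants sharp_score, max_scale, and base_cov are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def blur_score(gray: np.ndarray) -> float:
    """Variance of the Laplacian: low values indicate a blurry image."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def adaptive_visual_covariance(gray: np.ndarray,
                               base_cov: np.ndarray,
                               sharp_score: float = 300.0,
                               max_scale: float = 100.0) -> np.ndarray:
    """Inflate the visual measurement covariance for blurry frames.

    sharp_score and max_scale are tuning constants assumed for this
    example; the paper's actual mapping from score to covariance may differ.
    """
    score = blur_score(gray)
    # The scale grows as the score drops below the "sharp" reference,
    # so blurry frames carry less weight in the fusion filter.
    scale = np.clip(sharp_score / max(score, 1e-6), 1.0, max_scale)
    return base_cov * scale

if __name__ == "__main__":
    img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input frame
    if img is None:
        raise SystemExit("frame.png not found")
    base_cov = np.diag([1.0, 1.0])  # nominal pixel-noise covariance of a feature
    print(adaptive_visual_covariance(img, base_cov))
```

In a filter-based fusion system, the returned matrix would replace the fixed measurement covariance of the visual update, leaving the inertial propagation unchanged.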

Original language: English
Title of host publication: IPMV 2022 - 2022 4th International Conference on Image Processing and Machine Vision
Publisher: Association for Computing Machinery
Pages: 94-101
Number of pages: 8
ISBN (Electronic): 9781450395823
DOIs
Publication status: Published - 25 Mar 2022
Event: 4th International Conference on Image Processing and Machine Vision, IPMV 2022 - Virtual, Online, China
Duration: 25 Mar 2022 - 27 Mar 2022

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 4th International Conference on Image Processing and Machine Vision, IPMV 2022
Country/Territory: China
City: Virtual, Online
Period: 25/03/22 - 27/03/22

Keywords

  • Visual-inertial navigation system
  • covariance matrix
  • motion blur
