A Dynamic Calibration Framework for the Event-Frame Stereo Camera System

Rui Hu, Jürgen Kogler, Margrit Gelautz, Min Lin, Yuanqing Xia*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The fusion of event cameras and conventional frame cameras is a novel research field, and a stereo setup consisting of one event camera and one frame camera can combine the advantages of both. This letter develops a dynamic calibration framework for the event-frame stereo camera system. In this framework, the first step is an initial detection of a circle-grid calibration pattern, and a sliding-window time matching method is proposed to match the event-frame pairs. Then, a refinement method is devised for both cameras to recover accurate pattern geometry. In particular, for the event camera, a computationally efficient patch-wise motion compensation method is designed to achieve time synchronization between the two cameras and to fit circles in an image of warped events. Finally, the relative pose between the two cameras is globally optimized by constructing a pose-landmark graph with two types of edges. The proposed calibration framework offers real-time performance and easy deployment, and its effectiveness is verified by experiments on self-recorded datasets.
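The abstract mentions a patch-wise motion compensation step that warps events to a common reference time so the circle-grid blobs appear sharp in an image of warped events (IWE). The sketch below is not the authors' implementation; it illustrates the general idea under simplifying assumptions: a constant 2D velocity per patch, found by a brute-force contrast (variance) search, with the reference time chosen to coincide with a frame timestamp so the two cameras are time-synchronized. All function names and parameters here are illustrative.

```python
# Minimal sketch (assumed, not the authors' code) of patch-wise motion compensation.
# events: (N, 3) array of (x, y, t); t_ref: reference time, e.g. a frame timestamp.
import numpy as np

def iwe_and_variance(xs, ys, ts, vx, vy, t_ref, shape):
    """Warp events to t_ref with velocity (vx, vy), accumulate an IWE, return its contrast."""
    xw = np.round(xs - vx * (ts - t_ref)).astype(int)
    yw = np.round(ys - vy * (ts - t_ref)).astype(int)
    ok = (xw >= 0) & (xw < shape[1]) & (yw >= 0) & (yw < shape[0])
    iwe = np.zeros(shape, dtype=np.float32)
    np.add.at(iwe, (yw[ok], xw[ok]), 1.0)
    return iwe, iwe.var()

def patch_velocity(xs, ys, ts, t_ref, shape, v_range=np.linspace(-200, 200, 21)):
    """Grid-search the constant velocity (px/s) that maximizes IWE contrast for one patch."""
    best_var, best_v = -np.inf, (0.0, 0.0)
    for vx in v_range:
        for vy in v_range:
            _, var = iwe_and_variance(xs, ys, ts, vx, vy, t_ref, shape)
            if var > best_var:
                best_var, best_v = var, (vx, vy)
    return best_v

def motion_compensate(events, t_ref, sensor_shape, patch=32):
    """Build a full-resolution IWE by compensating each patch independently."""
    xs, ys, ts = events[:, 0], events[:, 1], events[:, 2]
    iwe = np.zeros(sensor_shape, dtype=np.float32)
    for py in range(0, sensor_shape[0], patch):
        for px in range(0, sensor_shape[1], patch):
            m = (xs >= px) & (xs < px + patch) & (ys >= py) & (ys < py + patch)
            if m.sum() < 10:  # skip nearly empty patches
                continue
            vx, vy = patch_velocity(xs[m], ys[m], ts[m], t_ref, sensor_shape)
            patch_iwe, _ = iwe_and_variance(xs[m], ys[m], ts[m], vx, vy, t_ref, sensor_shape)
            iwe += patch_iwe
    return iwe
```

Circle centers could then be fitted on the resulting IWE with a standard blob detector, which is the kind of refinement the abstract describes before the pose-landmark graph optimization.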

Original language: English
Pages (from-to): 11465-11472
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 9
Issue number: 12
DOIs
Publication status: Published - 2024

Keywords

  • Calibration and identification
  • event cameras
  • event-frame stereo camera system
  • sensor fusion
