Asynchronous Event-Based Corner Detection Using Adaptive Time Threshold

Jinjian Li, Li Su, Chuandong Guo, Xiangyu Wang, Quan Hu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Event cameras are novel neuromorphic vision sensors with ultrahigh temporal resolution and low latency, both on the order of microseconds. Instead of image frames, event cameras generate an asynchronous event stream of per-pixel intensity changes with precise timestamps. The resulting sparse data structure impedes applying many conventional computer vision techniques to event streams, so dedicated algorithms must be designed to leverage the information provided by event cameras. In our work, a motion- and scene-adaptive time threshold for event data is proposed. As a parameter describing the global characteristics of the event stream, this time threshold can be used for low-level visual tasks such as event denoising and feature extraction. Based on this threshold, the normalization method for the surface of active events (SAE) is explored from a new perspective. Unlike the previous speed-invariant time surface, this normalized SAE is constructed by an adaptive exponential decay (AED-SAE) and can be directly applied to the event-based Harris corner detector. The proposed corner detector is evaluated on real and synthetic datasets with different resolutions. It exhibits higher accuracy than comparable algorithms while maintaining high computational efficiency across datasets with different resolutions and texture levels.
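
The abstract outlines a three-step pipeline: estimate an adaptive time threshold from the event stream, use it as the decay constant of an exponentially decayed surface of active events (the AED-SAE), and score the decayed surface with a Harris corner measure. The sketch below illustrates that pipeline under stated assumptions; the function names, the rate-based estimate of the time constant, the window and patch sizes, and the detection threshold are illustrative choices, not the paper's exact formulation.

```python
"""Minimal sketch of an event-based Harris corner detector on a decayed SAE.
All names, the rate-based choice of tau, and the thresholds are assumptions
for illustration only; they do not reproduce the paper's AED-SAE exactly."""
from collections import deque
import numpy as np


def adaptive_tau(recent_timestamps, scale=50.0, fallback=0.01):
    """Estimate a motion/scene-adaptive time constant (seconds) from the
    mean inter-event interval of recent events (assumed heuristic)."""
    ts = np.sort(np.asarray(recent_timestamps, dtype=np.float64))
    if ts.size < 2:
        return fallback
    return float(np.mean(np.diff(ts))) * scale


def decayed_sae(sae_t, t_now, tau):
    """Normalize an SAE patch of latest timestamps with an exponential
    decay: recently updated pixels map near 1, stale pixels near 0."""
    return np.exp(-(t_now - sae_t) / tau)


def harris_response(patch, k=0.04):
    """Standard Harris response computed directly on the decayed-SAE patch."""
    Iy, Ix = np.gradient(patch)
    Sxx, Syy, Sxy = (Ix * Ix).sum(), (Iy * Iy).sum(), (Ix * Iy).sum()
    return Sxx * Syy - Sxy * Sxy - k * (Sxx + Syy) ** 2


def process_events(events, height, width, radius=4, harris_thresh=0.1):
    """Run the sketch over an iterable of (t, x, y, polarity) events and
    return detected corner events as (t, x, y) tuples."""
    sae = np.full((height, width), -np.inf)      # latest timestamp per pixel
    recent = deque(maxlen=10000)                 # sliding window of timestamps
    corners = []
    for t, x, y, _ in events:
        sae[y, x] = t
        recent.append(t)
        # skip events too close to the border for a full patch
        if x < radius or y < radius or x >= width - radius or y >= height - radius:
            continue
        tau = adaptive_tau(recent)
        patch = decayed_sae(
            sae[y - radius:y + radius + 1, x - radius:x + radius + 1], t, tau)
        if harris_response(patch) > harris_thresh:
            corners.append((t, x, y))
    return corners
```

In this sketch the decay constant is re-estimated from the recent event rate on every event, which is what makes the normalization adapt to camera motion and scene texture; a fixed tau would reproduce the ordinary exponentially decaying time surface instead.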

Original language: English
Pages (from-to): 9512-9522
Number of pages: 11
Journal: IEEE Sensors Journal
Volume: 23
Issue number: 9
DOIs
Publication status: Published - 1 May 2023

Keywords

  • Adaptive time threshold
  • corner detection
  • event camera
  • event datasets
