Asynchronous Event-Based Corner Detection Using Adaptive Time Threshold

Jinjian Li, Li Su, Chuandong Guo, Xiangyu Wang, Quan Hu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Event cameras are novel neuromorphic vision sensors with ultrahigh temporal resolution and low latency, both on the order of microseconds. Instead of image frames, event cameras generate an asynchronous event stream of per-pixel intensity changes with precise timestamps. The resulting sparse data structure impedes applying many conventional computer vision techniques to event streams, and specific algorithms should be designed to leverage the information provided by event cameras. In our work, a motion- and scene-adaptive time threshold for event data is proposed. As a parameter describing the global characteristic of the event stream, this time threshold can be used for low-level visual tasks, such as event denoising and feature extraction. Based on this threshold, the normalization method for the surface of active events (SAE) is explored from a new perspective. Different from the previous speed-invariant time surface, this normalized SAE is constructed by an adaptive exponential decay (AED-SAE) and can be directly applied to the event-based Harris corner detector. The proposed corner detector is evaluated on real and synthetic datasets with different resolutions. The proposed algorithm exhibits higher accuracy than congeneric algorithms and maintains high computational efficiency on datasets with different resolutions and texture levels.
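As a rough illustration of the pipeline the abstract describes, the sketch below builds a surface of active events, normalizes it with an exponential decay scaled by an adaptive time threshold, and evaluates a Harris response on the result. The threshold estimator, sensor resolution, and all parameter values are illustrative assumptions rather than the paper's actual formulation, and OpenCV's cv2.cornerHarris stands in for the authors' event-based Harris detector.

import numpy as np
import cv2  # cornerHarris stands in for an event-based Harris response

# Sensor resolution is an assumption (e.g., a DAVIS346-style camera).
H, W = 260, 346

def adaptive_time_threshold(timestamps, window=10000):
    # Crude stand-in for the adaptive time threshold: the time span covered
    # by the most recent `window` events. It shrinks for fast motion or dense
    # texture and grows for slow motion or sparse scenes.
    recent = np.asarray(timestamps[-window:], dtype=np.float64)
    if recent.size < 2:
        return 1e-2  # fallback threshold in seconds
    return float(recent[-1] - recent[0])

def aed_sae(events, t_now, tau):
    # Surface of active events (SAE): latest timestamp per pixel, then an
    # exponential decay scaled by the adaptive threshold tau maps the surface
    # to [0, 1] regardless of how fast the scene is moving.
    sae = np.zeros((H, W), dtype=np.float64)
    for x, y, t, _polarity in events:
        sae[y, x] = t
    surface = np.exp(-(t_now - sae) / tau)
    surface[sae == 0] = 0.0  # pixels that never fired stay at zero
    return surface.astype(np.float32)

def detect_corners(surface, block_size=2, ksize=3, k=0.04, rel_thresh=0.01):
    # Harris response on the normalized surface; keep strong responses.
    response = cv2.cornerHarris(surface, block_size, ksize, k)
    ys, xs = np.where(response > rel_thresh * response.max())
    return list(zip(xs.tolist(), ys.tolist()))

In a typical use, tau would be re-estimated from a running timestamp buffer and the surface rebuilt at each corner query before calling detect_corners.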

Original language: English
Pages (from-to): 9512-9522
Number of pages: 11
Journal: IEEE Sensors Journal
Volume: 23
Issue number: 9
DOI
Publication status: Published - 1 May 2023
