Abstract
Event cameras are novel neuromorphic vision sensors with ultrahigh temporal resolution and low latency, both on the order of microseconds. Instead of image frames, event cameras generate an asynchronous event stream of per-pixel intensity changes with precise timestamps. The resulting sparse data structure impedes applying many conventional computer vision techniques to event streams, so dedicated algorithms must be designed to leverage the information provided by event cameras. In our work, a motion- and scene-adaptive time threshold for event data is proposed. As a parameter describing the global characteristics of the event stream, this time threshold can be used for low-level visual tasks such as event denoising and feature extraction. Based on this threshold, the normalization method for the surface of active events (SAE) is explored from a new perspective. Unlike the previous speed-invariant time surface, this normalized SAE is constructed by an adaptive exponential decay (AED-SAE) and can be directly applied to the event-based Harris corner detector. The proposed corner detector is evaluated on real and synthetic datasets with different resolutions. The proposed algorithm exhibits higher accuracy than comparable algorithms and maintains high computational efficiency on datasets with different resolutions and texture levels.
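To make the abstract's two key ingredients concrete, the following is a minimal sketch of an exponentially decayed surface of active events (SAE) driven by an adaptive time constant. The adaptive rule shown here (scaling the median inter-event interval by a factor `k`) and the function names are illustrative assumptions, not the paper's exact formulation, which the abstract does not specify.

```python
import numpy as np

def adaptive_tau(timestamps, k=3.0):
    """Hypothetical motion/scene-adaptive time threshold: scale the
    median inter-event interval of the recent stream by k.
    (Assumed rule for illustration; not the paper's exact formula.)"""
    dts = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
    dts = dts[dts > 0]
    return k * float(np.median(dts)) if dts.size else 1.0

def exp_decay_sae(events, width, height, t_ref, tau):
    """SAE with exponential decay: each pixel holds
    exp(-(t_ref - t_last) / tau) for its most recent event,
    and 0 if it has never fired."""
    last = np.full((height, width), -np.inf)
    for x, y, t in events:          # events: (x, y, timestamp)
        if t > last[y, x]:
            last[y, x] = t
    return np.where(np.isfinite(last),
                    np.exp(-(t_ref - last) / tau), 0.0)
```

A frame-like corner detector (e.g., Harris) can then operate on the resulting dense, normalized surface; because `tau` tracks the stream's own timing statistics, the surface's contrast is less sensitive to scene motion speed.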
Original language | English |
---|---|
Pages (from-to) | 9512-9522 |
Number of pages | 11 |
Journal | IEEE Sensors Journal |
Volume | 23 |
Issue | 9 |
DOI | |
Publication status | Published - 1 May 2023 |