TY - JOUR
T1 - Reliable Event Generation with Invertible Conditional Normalizing Flow
AU - Gu, Daxin
AU - Li, Jia
AU - Zhu, Lin
AU - Zhang, Yu
AU - Ren, Jimmy S.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/2/1
Y1 - 2024/2/1
AB - Event streams provide a novel paradigm to describe visual scenes by capturing intensity variations above specific thresholds along with various types of noise. Existing event generation methods usually rely on one-way mappings using hand-crafted parameters and noise rates, which may not adequately suit diverse scenarios and event cameras. To address this limitation, we propose a novel approach to learn a bidirectional mapping between the feature space of event streams and their inherent parameters, enabling the generation of reliable event streams with enhanced generalization capabilities. We first randomly generate a vast number of parameters and synthesize massive event streams using an event simulator. Subsequently, an event-based normalizing flow network is proposed to learn the invertible mapping between the representation of a synthetic event stream and its parameters. The invertible mapping is implemented by incorporating an intensity-guided conditional affine simulation mechanism, facilitating better alignment between event features and parameter spaces. Additionally, we impose constraints on event sparsity, edge distribution, and noise distribution through novel event losses, further emphasizing event priors in the bidirectional mapping. Our framework surpasses state-of-the-art methods in video reconstruction, optical flow estimation, and parameter estimation tasks on synthetic and real-world datasets, exhibiting excellent generalization across diverse scenes and cameras.
KW - Conditional normalizing flow
KW - contrast threshold
KW - event camera
KW - event generation
KW - event noise rate
UR - http://www.scopus.com/inward/record.url?scp=85176296918&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2023.3326538
DO - 10.1109/TPAMI.2023.3326538
M3 - Article
C2 - 37871096
AN - SCOPUS:85176296918
SN - 0162-8828
VL - 46
SP - 927
EP - 943
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 2
ER -