TY - JOUR
T1 - EventGAN
T2 - An Unsupervised Low-Light Grayscale Image Enhancement Method Based on Event Camera
AU - Wu, Zehao
AU - Xia, Yuanqing
AU - Hu, Rui
AU - Gao, Runze
N1 - Publisher Copyright:
© 2001-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Event cameras, as a novel class of bioinspired vision sensors, are ideal for low-light enhancement thanks to their high dynamic range (HDR) characteristics. However, two critical challenges emerge when applying event camera data to low-light image enhancement (LIME) tasks: ineffective fusion between conventional images and event data streams, and the lack of paired training data. To address these problems, in this article, we propose EventGAN, an event-assisted unsupervised low-light enhancement method. First, we propose a modified image-to-event simulation method that transforms low-light-enhanced images into event representations, enabling effective joint processing of images and event data. Second, we design an event similarity loss function that establishes mappings between input event images and simulated event images, eliminating the need for paired training data. Finally, we introduce a local discriminator designed to suppress regional overexposure or underexposure artifacts, thereby significantly improving the effectiveness of LIME. Experimental results demonstrate that our method outperforms current LIME techniques. As a sensor signal processing approach for extreme lighting, it not only solves key challenges in using event cameras for low-light enhancement but also produces clearer, more natural, and higher-quality images in very dark conditions.
AB - Event cameras, as a novel class of bioinspired vision sensors, are ideal for low-light enhancement thanks to their high dynamic range (HDR) characteristics. However, two critical challenges emerge when applying event camera data to low-light image enhancement (LIME) tasks: ineffective fusion between conventional images and event data streams, and the lack of paired training data. To address these problems, in this article, we propose EventGAN, an event-assisted unsupervised low-light enhancement method. First, we propose a modified image-to-event simulation method that transforms low-light-enhanced images into event representations, enabling effective joint processing of images and event data. Second, we design an event similarity loss function that establishes mappings between input event images and simulated event images, eliminating the need for paired training data. Finally, we introduce a local discriminator designed to suppress regional overexposure or underexposure artifacts, thereby significantly improving the effectiveness of LIME. Experimental results demonstrate that our method outperforms current LIME techniques. As a sensor signal processing approach for extreme lighting, it not only solves key challenges in using event cameras for low-light enhancement but also produces clearer, more natural, and higher-quality images in very dark conditions.
KW - Event camera
KW - low-light image enhancement (LIME)
KW - unsupervised learning
UR - https://www.scopus.com/pages/publications/105020044009
U2 - 10.1109/JSEN.2025.3622656
DO - 10.1109/JSEN.2025.3622656
M3 - Article
AN - SCOPUS:105020044009
SN - 1530-437X
VL - 25
SP - 42871
EP - 42880
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 23
ER -