Abstract
Video white balance corrects the scene colors of video frames to their colors under a standard white illumination. Due to camera movement, video white balance often suffers from temporal instability, producing unnatural color changes between frames. This paper presents a video white balance stabilization method for spatially correct and temporally stable color correction. It exploits the color invariance of the same object across frames to obtain consistent illumination color estimates. Specifically, it detects gray pixels that carry the potential illumination color, and computes their inter-frame motion with the assistance of an inertial measurement unit (IMU) to establish gray-pixel correspondences and fuse their colors between adjacent frames. Because the IMU provides motion cues that are more robust and accurate under large camera movement and in texture-less regions of the scene, our method generates better gray pixel correspondences and illumination color estimates for white balance stabilization. In addition, our method is computationally efficient enough to be deployed on mobile phones. Experimental results show that it significantly improves the temporal stability while maintaining the spatial correctness of white balance for videos recorded by cameras equipped with IMU sensors.
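The pipeline the abstract describes can be sketched in simplified form: detect near-gray pixels, relate them across frames with a rotation-induced homography (the kind an IMU gyroscope can supply), temporally fuse the per-frame illumination estimates, and apply the correction. This is an illustrative sketch, not the authors' implementation; the grayness threshold, the fusion weight `alpha`, and all function names are assumptions.

```python
import numpy as np

def detect_gray_mask(frame, thresh=0.08):
    """Mark pixels whose chromaticity is near neutral (R ~ G ~ B)."""
    s = frame.sum(axis=2, keepdims=True) + 1e-6
    chroma = frame / s                          # per-pixel rgb chromaticity
    dist = np.abs(chroma - 1.0 / 3.0).max(axis=2)
    return dist < thresh

def imu_homography(K, R):
    """Homography induced by a pure camera rotation R (e.g. integrated gyro),
    with camera intrinsics K: H = K R K^-1."""
    return K @ R @ np.linalg.inv(K)

def estimate_illumination(frame, mask):
    """Mean color of gray pixels, normalized, as the illumination estimate."""
    pixels = frame[mask]
    if len(pixels) == 0:
        return np.ones(3) / 3.0                 # fall back to neutral
    est = pixels.mean(axis=0)
    return est / (est.sum() + 1e-6)

def fuse(prev_est, cur_est, alpha=0.8):
    """Temporal fusion: lean on the previous estimate for stability."""
    est = alpha * prev_est + (1.0 - alpha) * cur_est
    return est / est.sum()

def white_balance(frame, illum):
    """Von Kries-style correction: divide each channel by its illumination
    component so a gray surface maps back to equal R, G, B."""
    gains = (1.0 / 3.0) / np.maximum(illum, 1e-6)
    return np.clip(frame * gains, 0.0, 1.0)
```

For example, a uniform gray scene tinted by an illuminant `[1.2, 1.0, 0.8]` yields an estimate matching the tint's chromaticity, and the correction restores equal channels. The real method additionally warps the gray-pixel mask with `imu_homography` before fusing, so that only estimates from corresponding scene points are averaged.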
Original language | English |
---|---|
Journal | IEEE Transactions on Multimedia |
DOIs | |
Publication status | Accepted/In press - 2025 |
Keywords
- color stabilization
- gray pixel
- illumination estimation
- inertial measurement unit
- video white balance