Single image dehazing using a novel criterion based segmenting dark channel prior

Chao Zhang, Yanjun Zhang, Jihua Lu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Single image dehazing has become a hot topic in image processing. Almost all dehazing algorithms suffer from one of two problems: they either introduce halo artifacts or incur high computational complexity. In this paper, we propose a novel single image dehazing algorithm built on the segmenting dark channel prior (SDCP) with a modified criterion, which is simple and effective. Based on the estimated global atmospheric light, the transmission map (TM) is obtained by a novel region-segmenting logic. The TM is then smoothed by guided image filtering (GF), preserving edge information. Experiments show that the proposed algorithm eliminates halo artifacts more effectively than SDCP or the GF-based dark channel prior (DCP), both subjectively and objectively, and that it is faster than the soft-matting-based DCP.
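The abstract outlines the standard DCP dehazing pipeline that the proposed method builds on: estimate the global atmospheric light, derive a transmission map, refine it with guided image filtering, and invert the haze imaging model. The sketch below is a minimal Python/OpenCV/NumPy illustration of that baseline pipeline, not the paper's method: the novel region-segmenting criterion is not described in the abstract, so the standard He et al. transmission estimate stands in for it, and all function and parameter names (dehaze, omega, t_min, etc.) are illustrative assumptions.

```python
import cv2
import numpy as np


def dark_channel(img, patch=15):
    # Per-pixel minimum over the color channels, then a min-filter over a local patch.
    min_rgb = np.min(img, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)


def estimate_atmospheric_light(img, dark, top_frac=0.001):
    # Average the colors of the brightest 0.1% of pixels in the dark channel.
    n = max(1, int(dark.size * top_frac))
    idx = np.argsort(dark.ravel())[-n:]
    return img.reshape(-1, 3)[idx].mean(axis=0)


def guided_filter(guide, src, radius=60, eps=1e-3):
    # Guided image filter (He et al.): edge-preserving smoothing of `src`
    # steered by the grayscale guidance image `guide`.
    ksize = (radius, radius)
    mean_I = cv2.boxFilter(guide, cv2.CV_64F, ksize)
    mean_p = cv2.boxFilter(src, cv2.CV_64F, ksize)
    cov_Ip = cv2.boxFilter(guide * src, cv2.CV_64F, ksize) - mean_I * mean_p
    var_I = cv2.boxFilter(guide * guide, cv2.CV_64F, ksize) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    mean_a = cv2.boxFilter(a, cv2.CV_64F, ksize)
    mean_b = cv2.boxFilter(b, cv2.CV_64F, ksize)
    return mean_a * guide + mean_b


def dehaze(img_bgr, omega=0.95, t_min=0.1):
    img = img_bgr.astype(np.float64) / 255.0
    A = estimate_atmospheric_light(img, dark_channel(img))
    # Standard DCP transmission estimate; the paper replaces this step with
    # its region-segmenting criterion, which the abstract does not detail.
    t = 1.0 - omega * dark_channel(img / A)
    # Refine the raw transmission map with guided filtering to preserve edges.
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY).astype(np.float64) / 255.0
    t = np.clip(guided_filter(gray, t), t_min, 1.0)[..., None]
    # Invert the haze imaging model I = J*t + A*(1 - t) to recover the scene radiance J.
    J = (img - A) / t + A
    return np.clip(J * 255.0, 0.0, 255.0).astype(np.uint8)
```

Typical usage would be along the lines of `cv2.imwrite('dehazed.png', dehaze(cv2.imread('hazy.png')))`; the guided-filter refinement is what keeps the transmission map aligned with image edges and thereby suppresses halo artifacts.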

Original language: English
Title of host publication: Eleventh International Conference on Graphics and Image Processing, ICGIP 2019
Editors: Zhigeng Pan, Xun Wang
Publisher: SPIE
ISBN (Electronic): 9781510635234
DOIs
Publication status: Published - 2020
Event: 11th International Conference on Graphics and Image Processing, ICGIP 2019 - Hangzhou, China
Duration: 12 Oct 2019 - 14 Oct 2019

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 11373
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: 11th International Conference on Graphics and Image Processing, ICGIP 2019
Country/Territory: China
City: Hangzhou
Period: 12/10/19 - 14/10/19

Keywords

  • Guided image filtering
  • Region segmentation logic
  • Segmenting dark channel prior
  • Single image dehazing
