Deep crisp boundaries: From boundaries to higher-level tasks

Yupei Wang, Xin Zhao, Yin Li, Kaiqi Huang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

75 Citations (Scopus)

Abstract

Edge detection has made significant progress with the help of deep convolutional networks (ConvNets). These ConvNet-based edge detectors have approached human-level performance on standard benchmarks. We provide a systematic study of these detectors' outputs and show that the detection results do not accurately localize edge pixels, which can be detrimental to tasks that require crisp edge inputs. As a remedy, we propose a novel refinement architecture that addresses the challenging problem of learning a crisp edge detector using a ConvNet. Our method leverages a top-down backward refinement pathway that progressively increases the resolution of feature maps to generate crisp edges. Our results achieve superior performance, surpassing human accuracy under the standard criteria on BSDS500 and largely outperforming state-of-the-art methods under stricter criteria. More importantly, we demonstrate the benefit of crisp edge maps for several important applications in computer vision, including optical flow estimation, object proposal generation, and semantic segmentation.
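
The refinement pathway described in the abstract upsamples coarse, semantically strong feature maps step by step, fusing each with a finer backbone stage before predicting edges at full resolution. Below is a minimal PyTorch sketch of one way such a top-down backward refinement head could be wired; the module names, channel widths, fusion-by-addition scheme, and bilinear upsampling are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RefineBlock(nn.Module):
    """One backward-refinement step: upsample the coarse top-down map to
    the lateral map's resolution, fuse the two by addition, re-convolve.
    (Illustrative fusion scheme, not the paper's exact block.)"""
    def __init__(self, top_ch, lateral_ch, out_ch):
        super().__init__()
        self.top = nn.Conv2d(top_ch, out_ch, 3, padding=1)
        self.lat = nn.Conv2d(lateral_ch, out_ch, 3, padding=1)
        self.fuse = nn.Conv2d(out_ch, out_ch, 3, padding=1)

    def forward(self, top, lateral):
        top = F.interpolate(self.top(top), size=lateral.shape[-2:],
                            mode="bilinear", align_corners=False)
        return F.relu(self.fuse(top + self.lat(lateral)))

class CrispEdgeHead(nn.Module):
    """Top-down pathway over backbone features listed fine-to-coarse;
    progressively restores resolution before predicting edges."""
    def __init__(self, chans=(64, 128, 256, 512), mid=32):
        super().__init__()
        self.start = nn.Conv2d(chans[-1], mid, 3, padding=1)
        # One refinement block per remaining (finer) backbone stage.
        self.blocks = nn.ModuleList(
            RefineBlock(mid, c, mid) for c in reversed(chans[:-1]))
        self.pred = nn.Conv2d(mid, 1, 1)

    def forward(self, feats, out_size):
        x = F.relu(self.start(feats[-1]))            # coarsest stage
        for block, lat in zip(self.blocks, reversed(feats[:-1])):
            x = block(x, lat)                        # fuse with finer stage
        x = F.interpolate(x, size=out_size, mode="bilinear",
                          align_corners=False)
        return torch.sigmoid(self.pred(x))           # edge probability map

# Toy usage with random stand-in backbone features (strides 4/8/16/32).
feats = [torch.randn(1, c, 320 // s, 320 // s)
         for c, s in zip((64, 128, 256, 512), (4, 8, 16, 32))]
edges = CrispEdgeHead()(feats, out_size=(320, 320))
print(edges.shape)  # torch.Size([1, 1, 320, 320])
```

The property this sketch is meant to illustrate is that the edge logits are predicted at (or near) full image resolution after repeated refinement, rather than produced on a coarse grid and upsampled afterwards, which is what blurs edges in standard ConvNet detectors.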

Original language: English
Article number: 8485388
Pages (from-to): 1285-1298
Number of pages: 14
Journal: IEEE Transactions on Image Processing
Volume: 28
Issue number: 3
DOI: https://doi.org/10.1109/TIP.2018.2874279
Publication status: Published - Mar 2019
Externally published: Yes

Keywords

  • Boundary detection
  • Deep learning


Cite this

Wang, Y., Zhao, X., Li, Y., & Huang, K. (2019). Deep crisp boundaries: From boundaries to higher-level tasks. IEEE Transactions on Image Processing, 28(3), 1285-1298. Article 8485388. https://doi.org/10.1109/TIP.2018.2874279