DeepUNet: A Deep Fully Convolutional Network for Pixel-Level Sea-Land Segmentation

Ruirui Li*, Wenjie Liu, Lei Yang, Shihao Sun, Wei Hu, Fan Zhang, Wei Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

310 Citations (Scopus)

Abstract

Semantic segmentation is a fundamental task in optical remote sensing image processing. Because of the complex maritime environment, sea-land segmentation is challenging. Although neural networks have achieved excellent performance in semantic segmentation in recent years, only a few works have applied CNNs to sea-land segmentation, and their results leave room for improvement. This paper proposes a novel deep convolutional neural network named DeepUNet. Like U-Net, its structure has a contracting path and an expansive path to produce high-resolution output. Unlike U-Net, however, DeepUNet uses DownBlocks instead of plain convolution layers in the contracting path and UpBlocks in the expansive path. These two novel blocks introduce two new connections, the U-connection and the Plus connection, which are designed to yield more precise segmentation results. To verify the network architecture, we construct a new, challenging sea-land dataset and compare DeepUNet against U-Net, SegNet, and SeNet on it. Experimental results show that DeepUNet improves accuracy by 1-2% over the other architectures, especially on high-resolution optical remote sensing imagery.
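
The abstract does not spell out the internals of the DownBlock and UpBlock. The PyTorch sketch below shows one plausible reading of the design, in which the Plus connection is an element-wise residual addition within a block and the U-connection is the familiar U-Net skip concatenation between the contracting and expansive paths. All layer counts, channel widths, and kernel sizes here are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of the DownBlock/UpBlock idea described in the abstract.
# Channel widths and layer counts are assumed; see the paper for the
# actual DeepUNet configuration.
import torch
import torch.nn as nn


class DownBlock(nn.Module):
    """Two convolutions with a Plus (element-wise residual) connection,
    followed by max-pooling on the contracting path."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels * 2, 3, padding=1)
        self.conv2 = nn.Conv2d(channels * 2, channels, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        y = self.relu(self.conv1(x))
        y = self.relu(self.conv2(y) + x)  # Plus connection
        return self.pool(y), y            # y is saved for the U-connection


class UpBlock(nn.Module):
    """Upsampling block that concatenates the matching DownBlock feature
    map (U-connection) and adds its own input back in (Plus connection)."""

    def __init__(self, channels):
        super().__init__()
        self.up = nn.Upsample(scale_factor=2, mode='nearest')
        self.conv1 = nn.Conv2d(channels * 2, channels * 2, 3, padding=1)
        self.conv2 = nn.Conv2d(channels * 2, channels, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x, skip):
        x = self.up(x)
        y = torch.cat([x, skip], dim=1)   # U-connection
        y = self.relu(self.conv1(y))
        return self.relu(self.conv2(y) + x)  # Plus connection


# Usage: one down/up pair restores the original spatial resolution.
down, up = DownBlock(32), UpBlock(32)
pooled, skip = down(torch.randn(1, 32, 64, 64))
out = up(pooled, skip)  # shape (1, 32, 64, 64)
```

Keeping the channel count constant across each block is what lets the residual addition work without projection layers; whether DeepUNet does the same is an assumption of this sketch.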

Original language: English
Article number: 8370071
Pages (from-to): 3954-3962
Number of pages: 9
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Volume: 11
Issue number: 11
DOIs
Publication status: Published - Nov 2018
Externally published: Yes

Keywords

  • Fully convolutional network (FCN)
  • SeNet
  • U-Net
  • optical remote sensing image
  • sea-land segmentation
