An efficient level set model with self-similarity for texture segmentation

Lixiong Liu*, Shengming Fan, Xiaodong Ning, Lejian Liao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

Textures widely exist in natural scenes, yet traditional level set models generally use only intensity information to construct the energy and ignore inherent texture features. These models therefore have difficulty segmenting texture images, especially when the texture objects have intensities similar to the background. To solve this problem, we propose a new level set model for texture segmentation that considers the impact of local Gaussian distribution fitting (LGDF), local self-similarity (LSS) and a new numerical scheme on the evolving contour. The proposed method first introduces a texture energy term based on the local self-similarity texture descriptor into the LGDF model, so that the evolving contour can effectively snap to texture boundaries. Secondly, a lattice Boltzmann method (LBM) is deployed as a new numerical scheme to solve the level set equation; it breaks the restriction of the Courant–Friedrichs–Lewy (CFL) condition that limits the time step of iterations in earlier numerical schemes. Moreover, GPU acceleration further improves the efficiency of the contour evolution. Experimental results show that our model can effectively handle the segmentation of synthetic and natural texture images with heavy noise, intensity inhomogeneity and cluttered backgrounds, while keeping computational complexity relatively low.
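For orientation, a plausible shape of the combined energy functional is sketched below. It assumes the standard LGDF fitting energy together with an analogous fitting term computed over LSS descriptor maps and the usual regularizers; the weights λ, μ, ν and the exact texture term are illustrative, not the paper's precise formulation.

$$
E(\phi) = E_{\mathrm{LGDF}}(\phi) + \lambda\, E_{\mathrm{LSS}}(\phi)
        + \mu \int_\Omega \tfrac{1}{2}\bigl(|\nabla\phi| - 1\bigr)^2 \, d\mathbf{x}
        + \nu \int_\Omega \bigl|\nabla H(\phi)\bigr| \, d\mathbf{x},
$$

where the LGDF term fits local Gaussian distributions inside and outside the contour,

$$
E_{\mathrm{LGDF}}(\phi) = \int_\Omega \sum_{i=1}^{2} \int_{\Omega_i}
    -\,\omega(\mathbf{x}-\mathbf{y})\, \log p_{i,\mathbf{x}}\bigl(I(\mathbf{y})\bigr)\, d\mathbf{y}\, d\mathbf{x},
\qquad
p_{i,\mathbf{x}}\bigl(I(\mathbf{y})\bigr) =
    \frac{1}{\sqrt{2\pi}\,\sigma_i(\mathbf{x})}
    \exp\!\left(-\frac{\bigl(u_i(\mathbf{x}) - I(\mathbf{y})\bigr)^2}{2\sigma_i^2(\mathbf{x})}\right),
$$

with ω a Gaussian window, u_i and σ_i the local means and standard deviations, and H the Heaviside function. Under this reading, E_LSS would take the same fitting form with the intensity I replaced by the LSS feature channels.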
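The LBM update itself can be sketched in a few lines. The fragment below is a minimal, hedged illustration of a D2Q9 BGK lattice Boltzmann step driven by a generic force field derived from the segmentation energy; the function name, coupling form, and iteration loop are hypothetical, and the paper's actual scheme and GPU kernels may differ.

```python
import numpy as np

# D2Q9 lattice directions and weights (standard choices).
E9 = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
               [1, 1], [-1, 1], [-1, -1], [1, -1]])
W9 = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def lbm_step(f, force, tau=1.0):
    """One BGK collision + streaming step for the level set field.

    f     : (9, H, W) distribution functions; phi = f.sum(axis=0)
    force : (H, W) speed/force term from the segmentation energy (hypothetical)
    tau   : relaxation time controlling the effective diffusion
    """
    phi = f.sum(axis=0)                            # macroscopic level set value
    feq = W9[:, None, None] * phi                  # equilibrium distributions
    f = f - (f - feq) / tau \
        + (2 * tau - 1) / (2 * tau) * W9[:, None, None] * force
    for i, (ex, ey) in enumerate(E9):              # streaming along each direction
        f[i] = np.roll(np.roll(f[i], ex, axis=1), ey, axis=0)
    return f, phi

# Usage sketch: initialize from an initial level set phi0 and iterate.
# f = W9[:, None, None] * phi0
# for _ in range(200):
#     force = ...  # data force from the LGDF + LSS terms (not shown)
#     f, phi = lbm_step(f, force)
```

Because the LBM step is local and explicit, it has no CFL-type stability bound on the time step and maps naturally onto per-pixel GPU threads, which is consistent with the efficiency claims in the abstract.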

Original language: English
Pages (from-to): 150-164
Number of pages: 15
Journal: Neurocomputing
Volume: 266
Publication status: Published - 29 Nov 2017

Keywords

  • Lattice Boltzmann method
  • Level set
  • Local self-similarity
  • Texture segmentation
