A stereo matching algorithm based on image segmentation and features point

Wang Guicai*, Wang Liang, Cui Pingyuan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

A novel stereo matching method based on image segmentation and feature points is presented. First, the texture of the original image is analysed to distinguish weakly textured and similarly textured regions, and the image is segmented by labelling these texture regions; small regions are then removed with a blob filter. Next, SIFT feature-point detection and matching yield a reliable, sparse set of disparities, and SAD area-based matching produces an initial dense disparity. Finally, using the distribution of the matched SIFT features, a disparity-continuity constraint, and a minimum distance classifier, a disparity is assigned to each segmented block. Experiments on standard test images show that the proposed method is effective: compared with traditional methods, it obtains a dense, high-precision disparity map quickly.
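The SAD area-based matching step in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a generic sum-of-absolute-differences block matcher, with the window radius `win` and search range `max_disp` chosen for illustration: for each left-image pixel, a small window is compared against horizontally shifted windows in the right image, and the shift with the lowest SAD cost becomes the disparity.

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, win=2):
    """Dense disparity by SAD block matching (illustrative sketch).

    For each pixel in the left image, slide a (2*win+1)^2 window over
    candidate shifts 0..max_disp in the right image and keep the shift
    with the smallest sum of absolute differences.
    """
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    for y in range(win, h - win):
        # Start far enough from the left border that all shifts are valid.
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1]
            best_d, best_cost = 0, np.inf
            for d in range(max_disp + 1):
                cand = right[y - win:y + win + 1,
                             x - d - win:x - d + win + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

A winner-takes-all SAD map like this is fast but noisy in weakly textured regions, which is exactly why the paper combines it with segmentation, sparse SIFT matches, and a continuity constraint to regularise the result per segment.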

Original language: English
Title of host publication: Proceedings of the 2009 2nd International Congress on Image and Signal Processing, CISP'09
DOIs
Publication status: Published - 2009
Externally published: Yes
Event: 2009 2nd International Congress on Image and Signal Processing, CISP'09 - Tianjin, China
Duration: 17 Oct 2009 → 19 Oct 2009

Publication series

Name: Proceedings of the 2009 2nd International Congress on Image and Signal Processing, CISP'09

Conference

Conference: 2009 2nd International Congress on Image and Signal Processing, CISP'09
Country/Territory: China
City: Tianjin
Period: 17/10/09 → 19/10/09

Keywords

  • Disparity map
  • Features point
  • Image segmentation
  • Minimum distance classifier
  • Stereo matching
