Probabilistic depth map fusion for real-time multi-view stereo

Duan Yong*, Pei Mingtao, Jia Yunde

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Citations (Scopus)

Abstract

In this paper, we propose a probabilistic method for fusing depth maps in real time in wide-baseline situations. We treat depth map fusion as a problem of probability density function (pdf) estimation. The original point cloud, rather than the reprojected depth map, is used to estimate the pdf, and a mathematical expectation computation method is proposed to reduce the complexity of the method. Experimental results show that the proposed method produces the fused depth map in real time and is very promising for fusing depth maps from multiple depth cameras with sparsely distributed viewpoints.
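The abstract does not spell out the estimation procedure, so the following is only an illustrative sketch, not the authors' implementation: one simple way to realize per-pixel pdf estimation followed by an expectation is to collect the candidate depths that the different views contribute along a pixel ray, build a 1D Gaussian kernel density estimate over them, and take the fused depth as the expectation of depth under that density. The function name, the Gaussian-kernel choice, and the fixed bandwidth are all assumptions made for this example.

```python
import numpy as np

def fuse_depth_candidates(candidates, bandwidth=0.05, grid_points=256):
    """Fuse candidate depths along one pixel ray (illustrative sketch).

    A 1D Gaussian kernel density estimate is built from the candidate
    depths contributed by different views, and the fused depth is the
    mathematical expectation of depth under that estimated pdf.
    """
    candidates = np.asarray(candidates, dtype=float)
    # Evaluate the density on a grid covering all candidates plus 3 sigma.
    lo = candidates.min() - 3.0 * bandwidth
    hi = candidates.max() + 3.0 * bandwidth
    grid = np.linspace(lo, hi, grid_points)
    # Sum of Gaussian kernels centred on each candidate depth.
    diffs = (grid[:, None] - candidates[None, :]) / bandwidth
    pdf = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    # Normalise to a proper density, then take E[d] by numerical integration.
    dx = grid[1] - grid[0]
    pdf /= pdf.sum() * dx
    return float((grid * pdf).sum() * dx)

# Three views observing the same surface point near 2.0 m:
fused = fuse_depth_candidates([1.9, 2.0, 2.1])
```

In a full pipeline this per-ray fusion would run once per pixel; the paper's contribution of an expectation computation that reduces complexity suggests the expectation is evaluated more cheaply than by explicit density integration as done here.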

Original language: English
Title of host publication: ICPR 2012 - 21st International Conference on Pattern Recognition
Pages: 368-371
Number of pages: 4
Publication status: Published - 2012
Event: 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan
Duration: 11 Nov 2012 - 15 Nov 2012

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 21st International Conference on Pattern Recognition, ICPR 2012
Country/Territory: Japan
City: Tsukuba
Period: 11/11/12 - 15/11/12
