Abstract
In this paper we propose a probabilistic method for fusing depth maps in real time in wide-baseline situations. We treat depth map fusion as a problem of probability density function (pdf) estimation. The original point cloud, rather than the reprojected depth map, is used to estimate the pdf, and a mathematical expectation computation method is proposed to reduce the complexity of the method. Experimental results show that the proposed method produces the fused depth map in real time, and is very promising for fusing depth maps from multiple depth cameras with sparsely distributed viewpoints.
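The abstract's core idea, fusing per-pixel depth candidates by estimating a pdf and taking an expectation in closed form rather than evaluating the density on a depth grid, can be illustrated with a minimal sketch. This is not the paper's exact formulation: the function name, the Gaussian kernel, the bandwidth `sigma`, and the mode-support window are all illustrative assumptions.

```python
import numpy as np

def fuse_depth_samples(depths, sigma=0.05):
    """Fuse candidate depth samples for one pixel (illustrative sketch).

    Models the per-pixel depth pdf as a Gaussian kernel density over the
    samples, locates the dominant mode, and returns the expectation of the
    density restricted to that mode. The expectation is a closed-form
    weighted mean, so no explicit pdf evaluation on a depth grid is needed.
    """
    depths = np.asarray(depths, dtype=float)
    # Kernel density evaluated at each sample location.
    diffs = depths[:, None] - depths[None, :]
    dens = np.exp(-0.5 * (diffs / sigma) ** 2).sum(axis=1)
    # Dominant mode = sample with highest density.
    mode = depths[np.argmax(dens)]
    # Keep only samples supporting that mode; outliers are rejected.
    support = np.abs(depths - mode) < 3.0 * sigma
    w = dens[support]
    return float(np.sum(w * depths[support]) / np.sum(w))
```

For example, three consistent measurements near 1.0 m plus one outlier at 5.0 m fuse to a depth near 1.0, since the outlier falls outside the dominant mode's support.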
Original language | English |
---|---|
Title of host publication | ICPR 2012 - 21st International Conference on Pattern Recognition |
Pages | 368-371 |
Number of pages | 4 |
Publication status | Published - 2012 |
Event | 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan Duration: 11 Nov 2012 → 15 Nov 2012 |
Publication series
Name | Proceedings - International Conference on Pattern Recognition |
---|---|
ISSN (Print) | 1051-4651 |
Conference
Conference | 21st International Conference on Pattern Recognition, ICPR 2012 |
---|---|
Country/Territory | Japan |
City | Tsukuba |
Period | 11/11/12 → 15/11/12 |
Cite this
Yong, D., Mingtao, P., & Yunde, J. (2012). Probabilistic depth map fusion for real-time multi-view stereo. In ICPR 2012 - 21st International Conference on Pattern Recognition (pp. 368-371). Article 6460148 (Proceedings - International Conference on Pattern Recognition).