Abstract
In this paper we propose a probabilistic method for fusing depth maps in real time in wide-baseline situations. We treat depth map fusion as a problem of probability density function (pdf) estimation. The original point cloud, rather than the reprojected depth map, is used to estimate the pdf, and a mathematical expectation computation method is proposed to reduce the complexity of the method. Experimental results show that the proposed method produces the fused depth map in real time and is promising for fusing depth maps from multiple depth cameras with sparsely distributed viewpoints.
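The record gives only the abstract, not the algorithm itself. As a rough illustration of the core idea (estimating a per-pixel depth pdf from candidate point-cloud depths and taking its mathematical expectation as the fused value), here is a minimal sketch using a Gaussian kernel density estimate; the function name, bandwidth, and weighting scheme are hypothetical, not taken from the paper:

```python
import numpy as np

def fuse_depth_candidates(candidates, weights=None, bandwidth=0.05):
    """Fuse candidate depth values for one pixel: estimate a pdf over
    the candidates with a Gaussian kernel density estimate, then return
    the mathematical expectation of that pdf as the fused depth.

    This is an illustrative sketch, not the paper's implementation."""
    candidates = np.asarray(candidates, dtype=float)
    if weights is None:
        weights = np.ones_like(candidates)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()

    # Evaluate the estimated pdf on a regular grid spanning the candidates.
    grid = np.linspace(candidates.min() - 3 * bandwidth,
                       candidates.max() + 3 * bandwidth, 512)
    dz = grid[1] - grid[0]

    # Kernel density estimate: weighted sum of Gaussian kernels,
    # one centred at each candidate depth.
    diffs = (grid[:, None] - candidates[None, :]) / bandwidth
    pdf = (weights[None, :] * np.exp(-0.5 * diffs ** 2)).sum(axis=1)
    pdf /= pdf.sum() * dz  # normalise to a proper density

    # Fused depth = expectation of the estimated pdf.
    return float((grid * pdf).sum() * dz)
```

With equal weights the expectation reduces to the weighted mean of the candidates; in a full system the weights would encode per-view confidence (e.g. visibility or photo-consistency), which is where the probabilistic fusion gains over naive averaging.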
Original language | English
---|---
Host publication | ICPR 2012 - 21st International Conference on Pattern Recognition
Pages | 368-371
Number of pages | 4
Publication status | Published - 2012
Event | 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan. Duration: 11 Nov 2012 → 15 Nov 2012
Publication series

Name | Proceedings - International Conference on Pattern Recognition
---|---
ISSN (Print) | 1051-4651
Conference

Conference | 21st International Conference on Pattern Recognition, ICPR 2012
---|---
Country/Territory | Japan
City | Tsukuba
Period | 11/11/12 → 15/11/12
Cite this
Yong, D., Mingtao, P., & Yunde, J. (2012). Probabilistic depth map fusion for real-time multi-view stereo. In ICPR 2012 - 21st International Conference on Pattern Recognition (pp. 368-371). Article 6460148 (Proceedings - International Conference on Pattern Recognition).