Deep surface normal estimation with hierarchical RGB-D fusion

Jin Zeng, Yanfeng Tong, Yunmu Huang, Qiong Yan, Wenxiu Sun, Jing Chen, Yongtian Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

54 Citations (Scopus)

Abstract

The growing availability of commodity RGB-D cameras has boosted applications in the field of scene understanding. However, as a fundamental scene understanding task, surface normal estimation from RGB-D data lacks thorough investigation. In this paper, a hierarchical fusion network with adaptive feature re-weighting is proposed for surface normal estimation from a single RGB-D image. Specifically, features from the color image and the depth map are successively integrated at multiple scales to ensure global surface smoothness while preserving visually salient details. Meanwhile, the depth features are re-weighted with a confidence map estimated from the depth before being merged into the color branch, to avoid artifacts caused by corruption of the input depth. Additionally, a hybrid multi-scale loss function is designed to learn accurate normal estimation given a noisy ground-truth dataset. Extensive experimental results validate the effectiveness of the fusion strategy and the loss design, outperforming state-of-the-art normal estimation schemes.
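To make the two core ideas in the abstract concrete, below is a minimal PyTorch-style sketch of (a) confidence-gated fusion of depth features into the color branch at a single scale and (b) a multi-scale angular loss. This is an illustrative sketch only, not the authors' architecture or their hybrid loss; all module names, layer sizes, and the choice of a cosine loss term are assumptions.

```python
# Hypothetical sketch of confidence-gated RGB-D feature fusion (single scale)
# and a multi-scale angular loss; names and layer choices are not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConfidenceGatedFusion(nn.Module):
    """Re-weights depth features with an estimated per-pixel confidence map
    before merging them into the color branch."""

    def __init__(self, channels):
        super().__init__()
        # Predict a confidence in [0, 1] from the depth features themselves.
        self.confidence = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        # Merge the gated depth features with the color features.
        self.merge = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, color_feat, depth_feat):
        conf = self.confidence(depth_feat)        # B x 1 x H x W
        gated_depth = depth_feat * conf           # suppress unreliable depth regions
        fused = torch.cat([color_feat, gated_depth], dim=1)
        return F.relu(self.merge(fused))


def multiscale_cosine_loss(preds, target):
    """Angular (cosine) loss summed over normal predictions at several scales;
    the ground-truth normal map is resized to match each prediction."""
    loss = 0.0
    for pred in preds:
        tgt = F.interpolate(target, size=pred.shape[-2:], mode="bilinear",
                            align_corners=False)
        pred = F.normalize(pred, dim=1)
        tgt = F.normalize(tgt, dim=1)
        loss = loss + (1.0 - (pred * tgt).sum(dim=1)).mean()
    return loss
```

In a hierarchical fusion scheme of the kind the abstract describes, a block like `ConfidenceGatedFusion` would be applied at each decoder scale, so coarse fusions enforce global smoothness while fine-scale fusions retain salient detail.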

Original language: English
Title of host publication: Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019
Publisher: IEEE Computer Society
Pages: 6146-6155
Number of pages: 10
ISBN (electronic): 9781728132938
DOI
Publication status: Published - Jun 2019
Event: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019 - Long Beach, United States
Duration: 16 Jun 2019 → 20 Jun 2019

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume: 2019-June
ISSN (Print): 1063-6919

Conference

Conference: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019
Country/Territory: United States
City: Long Beach
Period: 16/06/19 → 20/06/19
