HandAttNet: Attention 3D Hand Mesh Estimation Network

Jintao Sun, Gangyi Ding

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

Abstract

Hand pose estimation and reconstruction have become increasingly compelling in the metaverse era. In reality, however, hands are often heavily occluded, which makes estimating occluded 3D hand meshes challenging. Previous work tends to ignore the information in the occluded regions; we believe that hand information from the occluded regions can be highly utilized. Therefore, in this study, we propose a hand mesh estimation network, HandAttNet. We design a cross-attention mechanism module and the DUO-FIT module to inject hand information into the occluded regions. Finally, we use a self-attention regression module for 3D hand mesh estimation. Our HandAttNet achieves SOTA performance.
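The abstract only names the components, so the following is a minimal PyTorch sketch of the general idea: a cross-attention block lets occluded-region tokens attend to visible hand features (the "injection" step), and a self-attention block regresses per-vertex 3D coordinates. All module names, tensor shapes, the 778-vertex (MANO-style) mesh size, and the internals of DUO-FIT are assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class OcclusionCrossAttention(nn.Module):
    """Hypothetical cross-attention: occluded-region tokens query visible hand tokens,
    so visible-hand information is injected into the occluded features."""

    def __init__(self, dim=256, num_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, occluded_tokens, visible_tokens):
        # queries from the occluded region; keys/values from visible hand features
        injected, _ = self.attn(query=occluded_tokens,
                                key=visible_tokens,
                                value=visible_tokens)
        return self.norm(occluded_tokens + injected)


class SelfAttentionMeshRegressor(nn.Module):
    """Hypothetical self-attention regressor: transformer encoder plus a linear head
    that outputs 3D coordinates for each mesh vertex."""

    def __init__(self, dim=256, num_heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, 3)

    def forward(self, tokens):
        # tokens: (B, num_vertices, dim) -> per-vertex 3D coordinates (B, num_vertices, 3)
        return self.head(self.encoder(tokens))


if __name__ == "__main__":
    B, dim = 2, 256
    occluded = torch.randn(B, 778, dim)  # per-vertex features of the occluded hand
    visible = torch.randn(B, 196, dim)   # image tokens from visible regions
    fused = OcclusionCrossAttention(dim)(occluded, visible)
    verts = SelfAttentionMeshRegressor(dim)(fused)
    print(verts.shape)  # torch.Size([2, 778, 3])
```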

Original language: English
Title of host publication: Proceedings - 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 645-646
Number of pages: 2
ISBN (Electronic): 9798350348392
DOI: https://doi.org/10.1109/VRW58643.2023.00165
Publication status: Published - 2023
Event: 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023 - Shanghai, China
Duration: 25 Mar 2023 - 29 Mar 2023

Publication series

Name: Proceedings - 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023

Conference

Conference: 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023
Country/Territory: China
City: Shanghai
Period: 25/03/23 - 29/03/23

Cite this

Sun, J., & Ding, G. (2023). HandAttNet: Attention 3D Hand Mesh Estimation Network. In Proceedings - 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023 (pp. 645-646). (Proceedings - 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/VRW58643.2023.00165