TY - GEN
T1 - Fast and Robust Point Cloud Registration with Tree-based Transformer
AU - Chen, Guangyan
AU - Wang, Meiling
AU - Yang, Yi
AU - Yuan, Li
AU - Yue, Yufeng
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - Point cloud registration is essential in computer vision and robotics. Recently, transformer-based methods have achieved advanced point cloud registration performance. However, the standard attention mechanism used in these methods attends to many low-relevance points and struggles to focus its attention weights on sparse, meaningful points, leading to limited local structure modeling capability and quadratic computational complexity. To address these limitations, we present the Tree-based Transformer (TrT), which extracts abundant local and global features with linear computational complexity. Specifically, TrT builds coarse-to-dense feature trees, and a novel Tree-based Attention (TrA) is proposed to guide the progressive convergence of the attended regions toward meaningful points and to structure point clouds following the trees. In each layer, the top $\mathcal{S}$ key points with the highest attention scores are selected, so that in the next layer attention is evaluated only within the high-relevance regions corresponding to the child points of these selected $\mathcal{S}$ points. Additionally, coarse features containing high-level semantic information are incorporated into the child points to guide the feature extraction process, facilitating local structure modeling and multiscale information integration. Consequently, TrA enables the model to focus on critical local structures and extract rich local information with linear computational complexity. Experiments demonstrate that our method achieves state-of-the-art performance on the 3DMatch and KITTI benchmarks. The code for our method is publicly available at https://github.com/CGuangyan-BIT/TrT.
UR - http://www.scopus.com/inward/record.url?scp=85202437716&partnerID=8YFLogxK
U2 - 10.1109/ICRA57147.2024.10610004
DO - 10.1109/ICRA57147.2024.10610004
M3 - Conference contribution
AN - SCOPUS:85202437716
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 773
EP - 780
BT - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Robotics and Automation, ICRA 2024
Y2 - 13 May 2024 through 17 May 2024
ER -