Research on pose point cloud matching error compensation method for confocal image assembly

Yijin Zhao, Xin Ye*, Lei Wang, Xinhai Yu, Heng Zhang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

High-precision matching of spatial poses is key to ensuring the assembly performance of complex three-dimensional devices. The human eye cannot directly judge the three-dimensional pose of mesoscale devices, so high-precision observation instruments are needed to assist accurate judgment and ensure smooth assembly. Based on a confocal-microscope three-dimensional assembly system for micro-devices, and given that existing methods struggle to align spatial poses during assembly, this paper focuses on how to perform high-precision pose matching using point cloud data measured with the confocal microscope. A new method for directly aligning three-dimensional poses is proposed. The principle of the ICP point cloud registration algorithm is studied, and a theory of shaft-hole point cloud registration is developed. A neural network is constructed to model the mapping relationship between the angle calculated from the RT matrix obtained by the ICP algorithm and the theoretical angle. The comprehensive assembly accuracy of the whole method is analyzed to be 2.41 μm. Finally, an actual experiment was carried out on a standard shaft and hole with a 3 μm single-side clearance, and the assembly was completed successfully.
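The abstract's pipeline rests on two steps that a short sketch can make concrete: registering a measured point cloud to a reference with ICP, and recovering a tilt angle from the resulting rotation–translation (RT) matrix, which is the quantity the paper's compensation network is trained on. The following is a minimal, self-contained illustration of those two steps only, not the authors' implementation; the point-to-point ICP, the synthetic clouds, and the 5° test rotation are all assumptions made for the example.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation


def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t


def icp(source, target, max_iter=50, tol=1e-8):
    """Basic point-to-point ICP; returns the accumulated rotation and translation."""
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)                    # nearest-neighbour correspondences
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t                            # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total


# Hypothetical example: recover the tilt of a "measured" cloud relative to a reference.
rng = np.random.default_rng(0)
target = rng.uniform(-1.0, 1.0, size=(500, 3))         # stand-in for the reference cloud
R_true = Rotation.from_euler("x", 5.0, degrees=True).as_matrix()
source = target @ R_true.T                              # measured cloud, tilted 5° about x

R_est, _ = icp(source, target)
angles = Rotation.from_matrix(R_est).as_euler("xyz", degrees=True)
print("estimated tilt about x:", angles[0], "deg")      # ≈ -5.0 (inverse of the applied tilt)
```

In the paper's scheme, angles extracted this way from the ICP RT matrix would be paired with the corresponding theoretical angles to train the compensation network; the sketch above stops at the angle-extraction step.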

Original language: English
Article number: 012073
Journal: Journal of Physics: Conference Series
Volume: 1303
Issue number: 1
DOIs
Publication status: Published - 2 Sept 2019
Event: 2nd International Conference on Mechanical, Electric and Industrial Engineering, MEIE 2019 - Hangzhou, China
Duration: 25 May 2019 - 27 May 2019
