Object-independent piston diagnosing approach for segmented optical mirrors via deep convolutional neural network

Mei Hui*, Weiqian Li, Ming Liu, Liquan Dong, Lingqin Kong, Yuejin Zhao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

Piston diagnosing approaches based on neural networks have shown great success, but many of these methods depend heavily on the imaging target of the optical system and inevitably face interference between submirrors. Therefore, a unique object-independent feature image is used to form an original kind of data set. In addition, a deep, 18-layer image-based convolutional neural network (CNN) is constructed. Furthermore, 9600 images are generated as a data set for each submirror, using a dedicated sensitive-area extraction step. The variation of results among all the submirrors is also analyzed to ensure generalization ability. Finally, the average root mean square error between the real and predicted piston values over six submirrors is approximately 0.0622λ. Our approach has the following characteristics: (1) the data sets are object-independent and contain more effective details, which makes them perform better in CNN training; (2) the network is deep, yet only a limited number of images are required; (3) the method can be applied to piston diagnosing of segmented mirrors, overcoming the difficulty caused by interference between submirrors. Our method requires no special hardware and is fast enough to be used at any time, so it may be widely applied in piston diagnosing of segmented mirrors.
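As a rough illustration of the kind of model the abstract describes, the sketch below builds an 18-layer CNN that regresses a scalar piston value (in units of λ) from a single-channel, object-independent feature image, and reports the root-mean-square error used as the evaluation metric. The abstract does not specify the exact architecture, input size, or training details, so the ResNet-18 backbone, the 128×128 input, and all names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: an 18-layer CNN regressor mapping a single-channel
# object-independent feature image to a scalar piston estimate (in waves).
# The ResNet-18 backbone and input size are assumptions for illustration;
# the paper's exact architecture is not given in the abstract.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class PistonRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = resnet18(weights=None)
        # Adapt the first convolution to single-channel feature images.
        self.backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                        padding=3, bias=False)
        # Replace the classification head with a single regression output.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(x).squeeze(-1)


def rmse_in_waves(pred, target):
    """Root mean square error between predicted and true piston values,
    both expressed in units of the wavelength lambda."""
    return torch.sqrt(torch.mean((pred - target) ** 2))


if __name__ == "__main__":
    model = PistonRegressor()
    # One dummy batch of 128x128 feature images (size is illustrative).
    images = torch.randn(8, 1, 128, 128)
    pistons = torch.rand(8)              # "true" pistons, in waves
    preds = model(images)
    print("RMSE (waves):", rmse_in_waves(preds, pistons).item())
```

Under the per-submirror scheme described in the abstract, one would presumably train such a regressor (or one output per submirror) on the 9600 extracted feature images for each submirror and average the resulting RMSE values across the six submirrors.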

Original language: English
Pages (from-to): 771-778
Number of pages: 8
Journal: Applied Optics
Volume: 59
Issue number: 3
DOIs
Publication status: Published - 20 Jan 2020
