Edge detection for optical synthetic apertures based on conditional generative adversarial networks

Mei Hui*, Yong Wu, Wenjie Tan, Ming Liu, Liquan Dong, Lingqin Kong, Yuejin Zhao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Detecting the interference fringes of an optical synthetic aperture is central to preventing misalignment of the sub-mirrors in piston, tip, and tilt. These fringes present two difficulties: (1) the edge information of the sub-mirrors involves complex shapes and large gaps; and (2) traditional edge detection algorithms require different optimal thresholds for different interference fringes and may lose boundary information. To address these problems, a novel method for detecting the edges of synthetic aperture fringe images is proposed. Because conditional generative adversarial networks avoid the difficulty of designing a task-specific loss function, they are well suited to this problem. We trained the network on more than 8000 images, comprising both real and simulated images. Experiments show that the proposed method reduces the false detection rate to 0.2, compared with 0.56 for the Canny algorithm. The method can also directly detect fringe edges in optical synthetic aperture systems with varied sub-mirror shapes and increasing numbers of sub-mirrors. When input images lack boundary information, traditional algorithms cannot restore the boundary, whereas the proposed method makes a global decision and infers and fills in the missing boundary. The maximum error between the generated boundary and the actual boundary is two pixels.
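To illustrate how a conditional GAN can stand in for a hand-designed, task-specific loss, the following is a minimal PyTorch sketch of a pix2pix-style objective (adversarial term plus L1 reconstruction term) for mapping a fringe image to an edge map. The paper does not publish its architectures or hyperparameters, so the networks G and D, the image size, and lambda_l1 below are illustrative placeholders, not the authors' implementation.

import torch
import torch.nn as nn

# Placeholder networks: a small conv stack stands in for the generator
# (fringe image -> edge map) and a PatchGAN-style discriminator that
# scores (fringe, edge) pairs.
G = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid())
D = nn.Sequential(nn.Conv2d(2, 16, 3, padding=1), nn.LeakyReLU(0.2),
                  nn.Conv2d(16, 1, 3, padding=1))

bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

def generator_loss(fringe, real_edge, lambda_l1=100.0):
    fake_edge = G(fringe)
    # Adversarial term: G tries to make D label the (fringe, generated edge) pair as real.
    pred_fake = D(torch.cat([fringe, fake_edge], dim=1))
    adv = bce(pred_fake, torch.ones_like(pred_fake))
    # L1 term keeps the generated edge map close to the ground-truth edge map;
    # no further task-specific loss has to be hand-designed.
    return adv + lambda_l1 * l1(fake_edge, real_edge), fake_edge

def discriminator_loss(fringe, fake_edge, real_edge):
    # D learns to separate real (fringe, edge) pairs from generated ones.
    pred_real = D(torch.cat([fringe, real_edge], dim=1))
    pred_fake = D(torch.cat([fringe, fake_edge.detach()], dim=1))
    return 0.5 * (bce(pred_real, torch.ones_like(pred_real)) +
                  bce(pred_fake, torch.zeros_like(pred_fake)))

# Example with a random 256x256 grayscale fringe image and edge map.
fringe = torch.rand(1, 1, 256, 256)
real_edge = torch.rand(1, 1, 256, 256)
g_loss, fake_edge = generator_loss(fringe, real_edge)
d_loss = discriminator_loss(fringe, fake_edge, real_edge)

In a training loop, the discriminator and generator losses would be minimized alternately, which is the standard conditional GAN setup the abstract refers to.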

Original language: English
Pages (from-to): 2782-2788
Number of pages: 7
Journal: Applied Optics
Volume: 58
Issue number: 11
DOIs
Publication status: Published - 10 Apr 2019

