Published in

Optics Express, 28(18), p. 26267, 2020

DOI: 10.1364/oe.397790

Deep residual learning for low-order wavefront sensing in high-contrast imaging systems

This paper is made freely available by the publisher.

Preprint: archiving forbidden
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Sensing and correction of low-order wavefront aberrations is critical for high-contrast astronomical imaging. State-of-the-art coronagraph systems typically use image-based sensing methods that exploit the rejected on-axis light, such as the Lyot-based low-order wavefront sensor (LLOWFS); these methods rely on linear least-squares fitting to recover Zernike basis coefficients from intensity data. However, the dynamic range of linear recovery is limited. We propose the use of deep neural networks with residual learning techniques for non-linear wavefront sensing. The deep residual learning approach extends the usable range of the LLOWFS sensor by more than an order of magnitude compared to the conventional methods, and can improve closed-loop control of systems with large initial wavefront error. We demonstrate that the deep learning approach performs well even in the low-photon regimes common to coronagraphic imaging of exoplanets.
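The linear baseline described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a hypothetical calibration matrix `D` that maps Zernike coefficients to LLOWFS differential intensities, which holds only approximately and only for small aberrations — the dynamic-range limitation the paper addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearized sensor model: for small aberrations the LLOWFS
# differential intensity is roughly linear in the Zernike coefficients.
# D is an assumed calibration (response) matrix, n_pixels x n_modes.
n_modes, n_pixels = 5, 200
D = rng.normal(size=(n_pixels, n_modes))

# True low-order aberration: small Zernike coefficients (arbitrary units)
z_true = np.array([0.02, -0.01, 0.015, 0.0, -0.005])

# Simulated intensity measurement with additive detector noise
intensity = D @ z_true + 1e-3 * rng.normal(size=n_pixels)

# Linear least-squares recovery of the Zernike coefficients
z_hat, *_ = np.linalg.lstsq(D, intensity, rcond=None)

print("recovered coefficients:", np.round(z_hat, 4))
```

For larger aberrations the linear model above breaks down, which is the regime where the paper's deep residual network replaces the least-squares fit.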