In high-contrast imaging applications, such as the direct imaging of exoplanets, a coronagraph is used to suppress the light from an on-axis star so that a dimmer, off-axis object can be imaged. To maintain a high-contrast dark region in the image, optical aberrations in the instrument must be minimized. The use of phase-contrast-based Zernike Wavefront Sensors (ZWFS) to measure and correct for aberrations has been studied for large segmented-aperture telescopes, and ZWFS are planned for the coronagraph instrument on the Roman Space Telescope (RST). ZWFS enable subnanometer wavefront sensing precision, but their response is nonlinear. Lyot-based Low-Order Wavefront Sensors (LLOWFS) are an alternative technique, in which light rejected by a coronagraph's Lyot stop is used for linear measurement of small wavefront displacements. Recently, the use of Deep Neural Networks (DNNs) to enable phase retrieval from intensity measurements has been demonstrated in several optical configurations. In a LLOWFS system, the use of DNNs rather than linear regression has been shown to greatly extend the sensor's usable dynamic range. In this work, we investigate the use of two different types of machine learning algorithms to extend the dynamic range of the ZWFS. We present static and dynamic deep learning architectures for single- and multi-wavelength measurements, respectively. Using simulated ZWFS intensity measurements, we validate the network training technique and present phase reconstruction results. We show an increase in the capture range of the ZWFS by a factor of 3.4 with a single wavelength and 4.5 with four wavelengths.
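The core idea, regressing low-order (Zernike) phase coefficients directly from ZWFS intensity images with a neural network trained on simulated data, can be illustrated with the minimal sketch below. This is not the architecture used in this work; the image size, number of Zernike modes, layer widths, and training hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ZWFSRegressor(nn.Module):
    """Toy CNN that regresses Zernike coefficients from a ZWFS intensity image.

    Image size (64x64), number of modes (15), and layer widths are assumptions
    for illustration only, not the architecture described in the paper.
    """
    def __init__(self, n_modes: int = 15):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                               # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                               # 32 -> 16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, n_modes),                       # predicted Zernike coefficients
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Training-step sketch: simulated ZWFS intensity frames as inputs, the injected
# Zernike coefficients as regression targets (random tensors stand in here).
model = ZWFSRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.randn(8, 1, 64, 64)   # stand-in for simulated ZWFS intensities
targets = torch.randn(8, 15)         # stand-in for injected Zernike coefficients

optimizer.zero_grad()
loss = loss_fn(model(images), targets)
loss.backward()
optimizer.step()
```

A multi-wavelength variant could stack one intensity frame per wavelength along the channel dimension (e.g., four input channels for four wavelengths), which is one plausible way such measurements could be combined; the paper's dynamic architecture is not reproduced here.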