Published in

Optica, Optics Express, 28(13), p. 19218, 2020

DOI: 10.1364/oe.390878

Conformal convolutional neural network (CCNN) for single-shot sensorless wavefront sensing

Journal article published in 2020 by Yuanlong Zhang, Tiankuang Zhou, Lu Fang, Lingjie Kong, Hao Xie, Qionghai Dai
This paper is made freely available by the publisher.

Preprint: archiving forbidden
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Wavefront sensing is essential in deep tissue imaging, where it guides a spatial light modulator to compensate for wavefront distortion and thereby improve imaging quality. Recently, convolutional neural network (CNN) based sensorless wavefront sensing methods have achieved remarkable speed advantages through single-shot measurement. However, convolutional filters handle the circular features of the point spread function (PSF) inefficiently, which limits their accuracy. In this paper, we propose a conformal convolutional neural network (CCNN) that boosts performance by pre-processing circular features into rectangular ones through conformal mapping. The proposed conformal mapping reduces the number of convolutional filters needed to describe a circular feature, thus enabling the neural network to recognize PSF features more efficiently. We demonstrate through simulations that the CCNN improves wavefront sensing accuracy by over 15% compared to a traditional CNN, and we validate this improvement in experiments. The improved performance makes the proposed method promising for high-speed deep tissue imaging.
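
The abstract describes pre-processing circular PSF features into rectangular ones via conformal mapping before the CNN. As an illustration only, the sketch below unwraps a PSF image onto a (radius, angle) grid using a log-polar transform (the conformal map w = log z); the paper's exact mapping, network, and the helper name logpolar_unwrap are not given in the abstract and are assumptions here.

    # Hypothetical sketch: unwrap a circularly symmetric PSF image into
    # rectangular (radius, angle) coordinates before feeding it to a CNN.
    # A log-polar transform is used as an illustrative conformal mapping;
    # it is not necessarily the mapping used in the paper.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def logpolar_unwrap(psf, n_radii=64, n_angles=128):
        """Resample a square PSF image onto a (radius, angle) grid."""
        h, w = psf.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        r_max = min(cy, cx)
        # Logarithmically spaced radii emphasize structure near the PSF core.
        radii = np.exp(np.linspace(0.0, np.log(r_max), n_radii))
        angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
        rr, aa = np.meshgrid(radii, angles, indexing="ij")
        rows = cy + rr * np.sin(aa)
        cols = cx + rr * np.cos(aa)
        # Bilinear interpolation at the mapped coordinates; circles become rows.
        return map_coordinates(psf, [rows, cols], order=1, mode="nearest")

    if __name__ == "__main__":
        # Toy example: a ring-shaped "PSF" becomes a horizontal band after unwrapping.
        y, x = np.mgrid[-64:64, -64:64]
        ring = np.exp(-((np.hypot(x, y) - 30.0) ** 2) / 20.0)
        rect = logpolar_unwrap(ring)
        print(rect.shape)  # (64, 128)

After such an unwrapping, a ring of a given radius maps to a row of constant height, so a small set of axis-aligned convolutional filters can describe what would otherwise require many filters at different orientations.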