
Wavefront Sensing I

Presentation Transcript


  1. Wavefront Sensing I Richard Lane, Department of Electrical and Computer Engineering, University of Canterbury, Christchurch, New Zealand

  2. Location

  3. Astronomical Imaging Group, past and present Dr Richard Lane, Professor Peter Gough, Associate Professor P. J. Bones, Associate Professor Peter Cottrell, Professor Richard Bates, Dr Bonnie Law, Dr Roy Irwan, Dr Rachel Johnston, Dr Marcos van Dam, Dr Valerie Leung, Richard Clare, Yong Chew, Judy Mohr

  4. Contents • Session 1 – Principles • Session 2 – Performance • Session 3 – Wavefront Reconstruction for 3D

  5. Principles of wavefront sensing • Introduction • Closed versus open loop wavefront sensing • Nonlinear wavefront sensing • Shack-Hartmann • Curvature • Geometric • Conclusions

  6. Imaging a star

  7. The effect of turbulence

  8. Adaptive Optics system [diagram: distorted incoming wavefront → telescope → deformable mirror → wavefront sensor → image plane]

  9. Closed loop system • Reduces the effects of disturbances, such as telescope vibration and modelling errors, by the loop gain • Design is limited by stability constraints • Does not inherently improve the noise performance unless the closed loop measurements are easier to make

  10. Postprocessing system (feedforward compensation) [diagram: distorted incoming wavefront → telescope with fixed mirror → wavefront sensor and detector plane → computer → image]

  11. Open loop system (SPID) • Sensitive to modelling errors • No stability issues with computer post processing • The problem is not noise but errors in modelling the system [figure: temporal coherence of the atmosphere over time T]

  12. Modelling the problem (step 1) • The relationship between the measured data and the object and the point spread function (psf) is linear: data = object ∗ psf + noise • A linear relationship means that if we multiply the input by α we multiply the output by α; the output does not change form
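
A tiny numerical check of the linearity claim (toy arrays chosen for illustration; np.convolve plays the role of blurring by the psf):

  import numpy as np

  o = np.array([0.0, 1.0, 0.0, 2.0])      # toy object
  h = np.array([0.25, 0.5, 0.25])         # toy psf
  d1 = np.convolve(o, h)                  # data = object convolved with psf
  d2 = np.convolve(3 * o, h)
  print(np.allclose(d2, 3 * d1))          # True: scaling the object scales the data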

  13. Modelling the problem (step 2) • The relationship between the phase and the psf is nonlinear: the psf is the squared magnitude of the Fourier transform of the complex pupil function (equivalently, a correlation of the pupil with itself), as written out below
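
Written out (the standard Fourier-optics relation this slide describes, with P the pupil function and \phi the phase):

  h(x) = \left| \mathcal{F}\{ P(u)\, e^{j\phi(u)} \} \right|^2

Equivalently, the optical transfer function is the autocorrelation of the generalized pupil P e^{j\phi}. Scaling \phi by \alpha does not scale h by \alpha, so the map from phase to psf is nonlinear.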

  14. Phase retrieval • Nonlinearity caused by 2π wrapping interacting with smoothing [figure: wrapped ambiguity versus the correct MAP estimate; ML and MAP estimation]

  15. Role of a typical wavefront sensor • To produce a linear relationship between the measurements and the phase • Speeds up reconstruction • Guarantees a solution • Degrades the ultimate performance • The phase is represented as a weighted sum of basis functions: \phi(x) = \sum_i a_i \theta_i(x)

  16. Solution is by linear equations • The measurement vector m, interaction matrix Θ and basis-function coefficients a are related by m = Θa • ith column of Θ corresponds to the measurement that would occur if the phase were the ith basis function • Three main issues • What has been lost in linearising? • How well can you solve the system of equations? • Are they the right equations?
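
As a minimal sketch of this step (the sizes, the random interaction matrix and the coefficients below are illustrative assumptions, not values from the lecture), the coefficients follow by linear least squares:

  import numpy as np

  n_meas, n_modes = 10, 4
  rng = np.random.default_rng(0)

  # ith column of Theta: the measurement produced when the phase
  # equals the ith basis function.
  Theta = rng.standard_normal((n_meas, n_modes))

  a_true = np.array([0.5, -1.0, 0.2, 0.0])                  # true coefficients
  m = Theta @ a_true + 0.01 * rng.standard_normal(n_meas)   # noisy measurements

  # Least-squares solution of m = Theta a.
  a_hat, *_ = np.linalg.lstsq(Theta, m, rcond=None)
  print(a_hat)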

  17. The effect of turbulence There is a linear relationship between the mean slope of the phase in a direction and the displacement of the image in that direction.

  18. Trivial example • There is a linear relationship between the mean slope and the displacement of the centroid • Measurements are the centroids of the data • Interaction matrix is the scaled identity • Reconstruct the coefficients of the tip and tilt
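
A minimal centroid sketch (the toy spot image and pixel grid are assumptions for illustration); the displacement of the centroid from the optical axis is the tip/tilt measurement:

  import numpy as np

  def centroid(img):
      """Centre of mass of an image, in pixel coordinates."""
      y, x = np.indices(img.shape)
      total = img.sum()
      return (x * img).sum() / total, (y * img).sum() / total

  img = np.zeros((9, 9))
  img[5, 6] = 1.0                  # toy spot, displaced from the centre
  cx, cy = centroid(img)
  print(cx - 4.0, cy - 4.0)        # displacement from the central pixel (4, 4)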

  19. Quality of the reconstruction • The centroid is proportional to the mean slope (Primot et al., Welsh et al.) • The best Strehl requires estimating the least mean square (LMS) phase (Glindemann) • To distinguish the mean and LMS slope you need to estimate the coma and higher order terms [figure: LMS slope, mean slope and phase]

  20. Difference between the LMS and mean tilt [figure: ideal image, coma distortion, detected image] • The peak value is better than the centroid for optimising the Strehl • Impractical for low light data

  21. Where to from here • The real problem is how to estimate higher aberration orders • Wavefront sensors can be divided into: • pupil plane techniques that measure slopes (curvatures) in the divided pupil plane • Shack-Hartmann • Curvature (Roddier), Pyramid (Ragazzoni) • Lateral Shearing Interferometers • image plane techniques that go directly from data in the image plane to the phase (nonlinear) • Phase diversity (Paxman) • Phase retrieval

  22. Geometric wavefront sensing • Pyramid, Shack-Hartmann and Curvature sensors are all essentially geometric wavefront sensors • They rely on the fact that light propagates perpendicular to the wavefront • There is a linear relationship between the displacement and the slope • Essentially achromatic

  23. W(x) z x Geometric optics model • A slope in the wave-front causes an incoming photon to be displaced by • Model is independent of wavelength and spatial coherence.

  24. Generalized wave-front sensor • This is the basis of the two most common wave-front sensors [figure: aberration, converging lens and focal plane for the Shack-Hartmann and curvature sensor geometries]

  25. Trade-off • For fixed photon count, you trade off the number of modes you can estimate in the phase screen against the accuracy with which you can estimate them • To estimate a high number of modes you need good resolution in the pupil plane • To make the estimate accurately you need good resolution in the image plane

  26. Properties of a wave-front sensor • Linearization: want a linear relationship between the wave-front and the measurements • Localization: the measurements must relate to a region of the aperture • Broadband: the sensor should operate over a wide range of wavelengths → the geometric optics regime

  27. Explicit division of the pupil [figure: direct image versus Shack-Hartmann subdivided image]

  28. Shack-Hartmann sensor • Subdivide the aperture and converge each subdivision to a different point on the focal plane. • A wave-front slope, Wx, causes a displacement of each image by zWx.
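
In code, the idea looks roughly like this (the square detector frame, subaperture grid and propagation distance z are illustrative assumptions): each subimage centroid displacement, divided by z, estimates the local mean slope.

  import numpy as np

  def sh_slopes(frame, n_sub, z):
      """Split a square detector frame into n_sub x n_sub subimages and
      convert each centroid displacement into a mean-slope estimate
      (displacement = z * slope, so slope = displacement / z)."""
      h = frame.shape[0] // n_sub
      yy, xx = np.indices((h, h))
      c = (h - 1) / 2.0                      # centre of a subimage
      slopes = np.zeros((n_sub, n_sub, 2))
      for i in range(n_sub):
          for j in range(n_sub):
              sub = frame[i*h:(i+1)*h, j*h:(j+1)*h]
              s = sub.sum()
              if s > 0:
                  slopes[i, j, 0] = ((xx * sub).sum() / s - c) / z
                  slopes[i, j, 1] = ((yy * sub).sum() / s - c) / z
      return slopes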

  29. Fundamental problem • Resolution in the pupil plane is inversely proportional to the resolution in the image plane • You can have good resolution in one but not both (uncertainty principle) [figure: pupil of diameter D with subaperture width w, and the corresponding image]

  30. Loss of information due to subdivision • Cannot measure the average phase difference between the apertures • Can only determine the mean phase slope within an aperture • As the apertures become smaller the light per aperture drops • As the aperture size drops below r0 (Fried parameter) the spot centroid becomes harder to measure

  31. Subdivided aperture

  32. Implicit subdivision • If you don’t image in the focal plane then the image looks like a blurred version of the aperture • If it looks like the aperture then you can localise in the aperture

  33. Explanation of the underlying principle • If there is a deviation from the average curvature in the wavefront, then the image will be brighter on one side of focus than the other • If there is no curvature from the atmosphere, it is equally bright on both sides of focus

  34. Slope based analysis of the curvature sensor • The displacement of light from one pixel to its neighbour is determined by the slope of the wavefront

  35. Slope based analysis of the curvature sensor • The signal is the difference between two slope signals → curvature
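
In one dimension (a sketch of the argument, with w the sample spacing):

  s(x) \propto W_x(x + w) - W_x(x) \approx w\, W_{xx}(x)

so differencing two slope signals yields a scaled second derivative of the wavefront, i.e. its curvature.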

  36. Phase information localisation in the curvature sensor • Diffraction blurring + geometric expansion

  37. Curvature sensing • Localization comes from the short effective propagation distance • Linear relationship between the curvature in the aperture and the normalized intensity difference (see below) • Broadband light helps reduce diffraction effects
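
To first order in the effective propagation distance z (a standard small-z result; the exact prefactor and sign convention depend on the optical geometry), the linear relationship is

  \frac{I_1 - I_2}{I_1 + I_2} \approx z\, \nabla^2 W(x)

inside the aperture.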

  38. Curvature sensing signal • The intensity signal gives an approximate estimate of the curvature • Two planes help remove scintillation effects [figure: simulated intensity measurement and curvature sensing estimate]

  39. Irradiance transport equation • The linear approximation is given below
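
A standard form of the equation (as used in Roddier-style curvature analyses) is

  \frac{\partial I}{\partial z} = -\nabla I \cdot \nabla W - I\, \nabla^2 W

and linearising about uniform aperture illumination I_0 gives

  I(x, z) \approx I_0 \left( 1 - z\, \nabla^2 W(x) \right)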

  40. Solution inside the boundary • There is a linear relationship between the signal and the curvature. • The sensor is more sensitive for large effective propagation distances.

  41. Solution at the boundary (mean slope) • If the intensity is constant at the aperture, the aperture edge propagates as a displaced Heaviside step H(·) [figure: I1, I2 and I1 − I2]
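
Sketching the boundary behaviour (prefactors depend on the geometry): the uniformly illuminated edge at x_c propagates as a step displaced by the mean normal slope,

  I(x, \pm z) \approx I_0\, H\!\left( x_c - x \pm z\, \partial W / \partial n \right)

so I_1 - I_2 is nonzero only in a thin band at the aperture edge, with a signal proportional to \partial W / \partial n.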

  42. The wavefront also changes • As the wave propagates, the wave-front itself changes • As the measurement approaches the focal plane the distortion of the wavefront becomes more important, and needs to be incorporated (van Dam and Lane)

  43. Non-linearity due to the wavefront changing • As a consequence the intensity also changes! • So, to second order, additional terms appear in the signal • The sensor is non-linear!

  44. Origin of terms • Due to the difference in the curvature in the x- and y-directions (astigmatism) • Due to the local wave-front slope, displacing the curvature measurement

  45. Consequences of the analysis • As z increases, the curvature sensor is limited by the nonlinear terms K and T • A third-order diffraction term limits the spatial resolution to roughly the Fresnel-zone scale √(λz)

  46. Analysis of the curvature sensor As the propagation distance, z, increases, • Sensitivity increases. • Spatial resolution decreases. • The relationship between the signal and the curvature becomes non-linear.

  47. Tradeoff in the curvature sensor Fundamental conflict between: • sensitivity, which dictates moving the detection planes toward the focal plane • aperture resolution, which dictates that the planes should be closer to the aperture

  48. W(x) z x Geometric optics model • Slopes in the wave-front causes the intensity distribution to be stretched like a rubber sheet • Wavefront sensing maps the distribution backto uniform

  49. Intensity distribution as a PDF • The intensity can be viewed as a probability density function (PDF) for photon arrival. • As the wave propagates, the PDF evolves. • The cumulative distribution function (CDF) also changes.

  50. Take two propagated images of the aperture (D = 1 m, r0 = 0.1 m, λ = 589 nm) [figure: intensity at −z and intensity at +z]
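
A one-dimensional sketch of the CDF-matching idea from the previous slide (the Gaussian intensities below are synthetic stand-ins, not the simulated D = 1 m data):

  import numpy as np

  x = np.linspace(-1.0, 1.0, 512)
  I_minus = np.exp(-((x + 0.05) / 0.4) ** 2)   # synthetic intensity at -z
  I_plus  = np.exp(-((x - 0.05) / 0.4) ** 2)   # synthetic intensity at +z

  # Normalise each intensity into a CDF for photon arrival.
  F_minus = np.cumsum(I_minus); F_minus /= F_minus[-1]
  F_plus  = np.cumsum(I_plus);  F_plus  /= F_plus[-1]

  # Matching quantiles between the two planes gives the transverse
  # photon displacement accumulated over the distance 2z, which maps
  # back to the local wavefront slope.
  q = np.linspace(0.01, 0.99, 99)
  disp = np.interp(q, F_plus, x) - np.interp(q, F_minus, x)
  print(disp.mean())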
