
Teleseismic surface wave tomography


Presentation Transcript


  1. From data to model: how should we handle uncertainties in a chain mixing model and data uncertainty? Helle A. Pedersen and Gwenaelle Salaun, ISTERRE, University of Grenoble and CNRS. Presentation based mainly on results from the Simbaad experiment: A. Paul, H. Karabulut, D. Hatzfeld, C. Papazachos, D. M. Childs, C. Pequegnat and the Simbaad Team, as well as close collaboration with: V. Farra, M. Bruneton, S. Fishwick, D. Snyder, and others

  2. Teleseismic surface wave tomography • Give me an array of stations • Give me recordings of distant seismic events • I will give you the (=some) model of Vs(x,y,z) • Can I give you a sensible estimate of the error on Vs(x,y,z)?

  3. What we do • Preprocessing • At each frequency: • Measure time delays between pairs of stations • Invert for phase velocity maps C(x,y) • Assemble phase velocities to obtain C(x,y,period) • For each grid point, invert for Vs(z) • Assemble shear wave profiles to obtain Vs(x,y,z)
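The core measurement in this chain, the inter-station time delay at one frequency, can be sketched as follows. This is a minimal illustration using the phase of the cross-spectrum averaged over a narrow band; the function name, band-averaging choice, and sign convention are assumptions for the sketch, not the actual Simbaad processing code.

```python
import numpy as np

def interstation_delay(u1, u2, dt, f_target, bandwidth=0.005):
    """Estimate the time delay between two station recordings at one
    frequency from the phase of their cross-spectrum.

    u1, u2   : windowed surface-wave traces (same length, same start time)
    dt       : sampling interval in seconds
    f_target : frequency (Hz) at which the delay is measured
    """
    n = len(u1)
    freqs = np.fft.rfftfreq(n, d=dt)
    cross = np.fft.rfft(u1) * np.conj(np.fft.rfft(u2))
    # average the complex cross-spectrum over a narrow band around f_target
    band = np.abs(freqs - f_target) < bandwidth
    phase = np.angle(cross[band].mean())
    # delay = phase / (2*pi*f); positive if u2 lags u1
    return phase / (2.0 * np.pi * f_target)
```

In practice the traces would first be windowed around the surface-wave arrival and quality-controlled, and the delay converted to an apparent phase velocity using the inter-station distance.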

  4. Teleseismic surface wave tomography – different types of difficulties • 1) Input data: data quantity and uncertainty • 2) Out-of-array propagation: simplistic models of the incoming waves • 3) Propagation effects inside the array: simplistic theory • 4) Phase velocity uncertainty: resolution issue • 5) Depth inversion uncertainty: resolution issue

  5. Data quality • Data heterogeneity is the norm rather than the exception

  6. Data quality • Data heterogeneity is the norm rather than the exception

  7. Data quality • Input data: data quantity and uncertainty – what can we reasonably do? • Available events: • long recording period (2 years) • Signal-to-noise ratio of signals: • strict quality control and rejection of faulty signals …but choices remain subjective • Systematic errors (glitches, mass centerings, …): • we pick up as much as possible automatically, but not everything is visible after preprocessing • Timing errors: • regular checks on P-wave arrivals (but what about small and/or random errors?) • Errors in metadata (instrument response): • big effort => phases OK, amplitudes within ±30% (!) • A thorough (but is it satisfying?) analysis of remaining time delay errors
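The signal-to-noise screening step can be sketched as a simple RMS-ratio gate; the function name, window convention, and threshold are illustrative assumptions, which is exactly why the slide notes that such choices remain subjective.

```python
import numpy as np

def snr_reject(trace, dt, signal_window, noise_window, min_snr=3.0):
    """Simple quality-control gate: reject a trace whose RMS
    signal-to-noise ratio falls below min_snr.
    signal_window, noise_window : (start, end) times in seconds.
    Returns (keep, snr)."""
    def rms(t0, t1):
        i0, i1 = int(t0 / dt), int(t1 / dt)
        seg = trace[i0:i1]
        return np.sqrt(np.mean(seg ** 2))
    snr = rms(*signal_window) / rms(*noise_window)
    return snr >= min_snr, snr
```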

  8. Data quality • Question: • Can we assume that the remaining data errors follow a normal distribution?
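One simple way to probe this question is to look at the higher sample moments of the delay-time residuals, which should vanish for a normal distribution; heavy tails from outliers that survived quality control show up as excess kurtosis. A minimal sketch (the function name is an illustrative assumption):

```python
import numpy as np

def moment_check(residuals):
    """Crude check of the Gaussian assumption for delay-time residuals:
    sample skewness and excess kurtosis, both ~0 for a normal
    distribution. Heavy tails appear as large positive excess kurtosis."""
    r = np.asarray(residuals, dtype=float)
    r = (r - r.mean()) / r.std()
    skew = np.mean(r ** 3)
    ex_kurt = np.mean(r ** 4) - 3.0
    return skew, ex_kurt
```

A formal test (e.g. D'Agostino's K²) would sharpen this, but even these two numbers flag non-Gaussian residuals that would invalidate a least-squares error propagation.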

  9. Out of array propagation : great-circle deviation • Major deterministic diffractions : systematic effects • Multiple diffraction and coda • Presence of higher modes (and body waves) • Great-circle deviation • Finite frequency effects Maupin, GJI 2011

  10. Out of array propagation : great-circle deviation 50s 25s Maupin, GJI 2011

  11. Out of array propagation : finite frequency effects 100s Love wave, phase and arrival angle kernels 100s Rayleigh wave, phase kernel Zhou, Dahlen and Nolet, GJI 2004 (Born)

  12. Out of array propagation : finite frequency effects 100s & 150km depth Chevrot and Zhao, GJI 2007

  13. Out-of-array effects • Actions: • - Carry out frequency-time filtering • - Allow for great-circle deviation • - Allow for non-plane wavefronts Question: Can we create a model of the errors associated with our approximations on wave propagation? Is it enough to have sensible 'garbage parameters' to avoid projecting out-of-array effects into the model? Bruneton et al., GJI, 2002
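Allowing for great-circle deviation amounts to treating the propagation azimuth as a free parameter when fitting arrival times across the array. Below is a hedged sketch of such a fit for a plane wavefront, using a grid search over azimuth with a linear least-squares fit of intercept and slowness at each trial angle; the names and the grid-search strategy are assumptions for illustration (Bruneton et al. use a more general, non-plane wavefront parameterisation).

```python
import numpy as np

def fit_plane_wave(x, y, t_obs, n_theta=720):
    """Fit a plane wavefront  t_i = t0 + s*(x_i*cos(th) + y_i*sin(th))
    to arrival times at an array, letting the propagation azimuth th
    deviate from the great-circle direction.  Grid search over th,
    linear least squares for intercept t0 and slowness s.
    Returns (th, s, rms) of the best fit."""
    best = None
    for th in np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False):
        proj = x * np.cos(th) + y * np.sin(th)
        A = np.column_stack([np.ones_like(proj), proj])
        coef, *_ = np.linalg.lstsq(A, t_obs, rcond=None)
        rms = np.sqrt(np.mean((t_obs - A @ coef) ** 2))
        if best is None or rms < best[2]:
            best = (th, coef[1], rms)
    return best
```

Note the inherent degeneracy: (th, s) and (th + 180°, -s) describe the same slowness vector, so it is the vector components s·cos(th), s·sin(th) that are meaningful.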

  14. Inside-array propagation: possible to extract useful information from fundamental-mode Rayleigh waves using strong approximations (no scattering, no finite frequency effects) Scattering by a mountain root Phase velocity at T=50s observed across a circular heterogeneity of 40 km diameter R->R R->L L->L Bodin and Maupin, GJI, 2008 Snieder, GJRaS, 1986

  15. Phase velocity maps and uncertainty

  16. Without oceanic slab With oceanic slab • Phase velocity maps and uncertainty: data distribution Example from teleseismic P-wave tomography But we are fine – aren't we?

  17. Phase velocity uncertainty: data distribution Input Without back-azimuth weighting With back-azimuth weighting
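The back-azimuth weighting illustrated here can be sketched as down-weighting events from over-represented azimuths, so that every populated azimuth bin contributes equally to the inversion. The function name and bin width are illustrative assumptions, not the weighting actually used in the slide.

```python
import numpy as np

def backazimuth_weights(baz_deg, bin_width=20.0):
    """Give each event weight 1/n_k, where n_k is the number of events
    in its back-azimuth bin, then normalise the weights to sum to 1.
    A dense cluster of events from one azimuth then carries no more
    total weight than a single event from a sparse azimuth."""
    baz = np.asarray(baz_deg, dtype=float) % 360.0
    bins = (baz // bin_width).astype(int)
    counts = np.bincount(bins, minlength=int(360.0 / bin_width))
    w = 1.0 / counts[bins]
    return w / w.sum()
```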

  18. Phase velocity uncertainty: a posteriori error maps • (of last inversion step)

  19. Phase velocity uncertainty Questions: 1) Simple, objective tools for regularisation? 2) How can we develop tools to better assess the impact of input data weighting?

  20. Vs(z) uncertainty - Smooth over depth (correlation length) - Importance of interfaces
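The "smooth over depth" constraint can be encoded as an a priori model covariance with Gaussian correlation over depth, as in linearised (Tarantola-Valette style) inversion: neighbouring depths are forced to vary together, with the correlation length controlling the smoothing. A minimal sketch; the function name and the sigma and corr_len values are illustrative assumptions.

```python
import numpy as np

def depth_smoothing_prior(z, sigma=0.1, corr_len=20.0):
    """A priori model covariance for the Vs(z) inversion:
    C_ij = sigma^2 * exp(-(z_i - z_j)^2 / (2 * corr_len^2)),
    with sigma the a priori standard deviation (km/s) and corr_len
    the correlation length over depth (km)."""
    z = np.asarray(z, dtype=float)
    dz = z[:, None] - z[None, :]
    return sigma ** 2 * np.exp(-0.5 * (dz / corr_len) ** 2)
```

A fixed correlation length smooths everywhere equally; handling known interfaces (e.g. the Moho) requires breaking the correlation across them, which is one reason the slide flags interfaces as important.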

  21. Vs(x,y,z) uncertainty Questions: Laterally varying depth smoothing? On which criteria? Usual resolution issues

  22. Putting the pieces together again Data processing and delay measurements Estimate of measurement error (Gaussian?) Ignored out-of-array effects Iterative inversion for phase velocity maps Resolution: managed, but partly unsatisfactory trade-off between spatial resolution and parameter resolution ? Ignored inside-array effects Iterative inversion for Vs(z) Resolution: managed, but partly unsatisfactory trade-off between spatial resolution and parameter resolution

  23. Going back to where we started • Give me an array of stations • Give me recordings of distant seismic events • I will give you the (= some) model of Vs(x,y,z) • It is possible to give you some estimate of the error on Vs(x,y,z) • BUT • All this careful work boils down to: which features of the model are resolved – and our tools for error analysis may not be adequate • Many open issues force us to make conservative choices at each step, thereby reducing vertical and lateral resolution • What we need is the uncertainty on the parameters of our interpretation, not the uncertainty on the model parameters themselves • We therefore have a serious problem in communicating error issues to the end-users (geology/tectonics)

  24. What do end users do to the output models? Top-of-slab geometry Dipping slab beneath Anatolia (?)

  25. What would I like to learn? • What I am expecting from uncertainty analysis: • Estimate of the probability function of parameters relating to the interpretation • Tools for others to estimate the probability function of their interpretation • Tools to decide how to decrease uncertainties: where will an effort (data, physics, inversion) be the most efficient? • Can statistical analysis alone solve these issues? • Can we create a library of synthetic seismograms for different test structures and array geometries? • Such a library must allow us to use different analysis techniques • Which hypotheses can we sensibly test? • How can such a library be hypothesis driven (?) • How can we create models of data uncertainty?
