
Least Squares Migration of Stacked Supergathers



Presentation Transcript


  1. Least Squares Migration of Stacked Supergathers. Wei Dai and Gerard Schuster, KAUST.

  2. RTM Problem & Possible Solution. Problem: RTM is computationally costly and I/O-intensive. Solution: multisource LSM RTM. Preconditioning speeds it up by a factor of 2-3, and encoded LSM reduces crosstalk while cutting computational cost and memory.

  3. Outline: Motivation; Multisource LSM theory; Signal-to-Noise Ratio (SNR); Numerical results; Conclusions.

  4. Multisource Migration: Phase-Encoded Multisource Migration.
  Forward model: $d_1 + d_2 = [L_1 + L_2]\, m$.
  Standard migration: $m_{mig} = L^T d$, i.e. $m_{mig} = L_1^T d_1 + L_2^T d_2$.
  Multisource migration of the blended supergather:
  $m_{mig} = [L_1 + L_2]^T (d_1 + d_2) = L_1^T d_1 + L_2^T d_2 + L_1^T d_2 + L_2^T d_1$,
  where the cross terms $L_1^T d_2 + L_2^T d_1$ are the crosstalk noise.
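To make the crosstalk decomposition concrete, here is a minimal numerical sketch (not the authors' code): L1 and L2 are stand-in random matrices playing the role of the per-shot Born modeling operators, and the final check verifies that migrating the blended supergather with the summed adjoint gives the standard-migration terms plus the crosstalk terms.

```python
# Illustrative only: random matrices stand in for the encoded modeling operators.
import numpy as np

rng = np.random.default_rng(0)
n_model, n_data = 50, 80
L1 = rng.standard_normal((n_data, n_model))
L2 = rng.standard_normal((n_data, n_model))
m = rng.standard_normal(n_model)          # stand-in reflectivity model

d1, d2 = L1 @ m, L2 @ m                   # individual shot gathers
d_blend = d1 + d2                         # blended supergather

m_mig = (L1 + L2).T @ d_blend             # multisource migration of the blend

signal = L1.T @ d1 + L2.T @ d2            # standard-migration terms
crosstalk = L1.T @ d2 + L2.T @ d1         # cross terms = crosstalk noise
assert np.allclose(m_mig, signal + crosstalk)
```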

  5. Multisource Migration: Phase-Encoded Multisource Least Squares Migration.
  Forward model: $d_1 + d_2 = [L_1 + L_2]\, m$.
  Instead of a single application of the adjoint, the blended data are inverted iteratively,
  $m^{(k+1)} = m^{(k)} + \alpha\, [L_1 + L_2]^T \big( (d_1 + d_2) - [L_1 + L_2]\, m^{(k)} \big)$,
  so that the crosstalk noise of standard multisource migration is reduced over the iterations.
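A minimal sketch of the iterative scheme, assuming a plain steepest-descent update with an exact line search; the operators are the same stand-in random matrices as above, not a wave-equation modeling code.

```python
# Steepest-descent least squares migration of a blended supergather (illustrative).
import numpy as np

rng = np.random.default_rng(1)
n_model, n_data = 50, 80
L1 = rng.standard_normal((n_data, n_model))
L2 = rng.standard_normal((n_data, n_model))
A = L1 + L2                        # blended (encoded) forward operator
m_true = rng.standard_normal(n_model)
d_blend = A @ m_true               # observed blended supergather

m = np.zeros(n_model)              # starting model m^(0)
for k in range(300):
    r = d_blend - A @ m            # residual of the blended data
    g = A.T @ r                    # gradient = migration of the residual
    Ag = A @ g
    alpha = (g @ g) / (Ag @ Ag)    # exact line-search step length
    m = m + alpha * g              # m^(k+1) = m^(k) + alpha * g

print("relative model error:",
      np.linalg.norm(m - m_true) / np.linalg.norm(m_true))
```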

  6. Outline: Motivation; Multisource LSM theory; Signal-to-Noise Ratio (SNR); Numerical results; Conclusions.

  7. Standard Migration SNR. Assume $d(t) = S(t) + N(t)$ with zero-mean white noise, and neglect geometric spreading. Migrating one CSG with $G$ geophones gives SNR ~ sqrt(G); migrating and stacking $S$ CSGs (standard migration: migrate, stack) gives SNR ~ sqrt(GS), at cost ~ O(S). Iterative multisource migration (migrate the supergather, iterate) gives SNR ~ sqrt(GI) after $I$ iterations, at cost ~ O(I).
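The square-root scaling is just the statistics of stacking independent zero-mean noise. A small illustrative simulation (all names and sizes are arbitrary, not from the slides) stacks N noisy copies of one trace and compares the measured SNR with sqrt(N) times the single-copy SNR.

```python
# Stacking N independent noisy copies raises SNR by ~sqrt(N) (illustrative).
import numpy as np

rng = np.random.default_rng(2)
nt = 2000
signal = np.sin(np.linspace(0.0, 8.0 * np.pi, nt))   # stand-in reflection signal
snr_single = np.std(signal)                          # noise std = 1, so SNR of one copy

def stacked_snr(n_copies):
    """SNR of the mean of n_copies noisy realizations of the same signal."""
    noise = rng.standard_normal((n_copies, nt))      # zero-mean white noise
    stack = (signal + noise).mean(axis=0)
    return np.std(signal) / np.std(stack - signal)

for n in (1, 4, 16, 64):
    print(f"N = {n:2d}: measured SNR = {stacked_snr(n):4.2f}, "
          f"sqrt(N) * SNR_1 = {np.sqrt(n) * snr_single:4.2f}")
```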

  8. The SNR of the MLSM image grows as the square root of the number of iterations: SNR = sqrt(GI). [Figure: SNR (0 to 7) versus number of iterations (1 to 300).]

  9. Multisource LSM Summary.
                     Stnd. Mig    Multisrc. LSM
     IO              1            1/100
     Cost            ~S           ~I
     SNR             sqrt(GS)     sqrt(GI)
     Resolution dx   1            1/2
     Cost vs quality: can I << S?

  10. Outline: Motivation; Multisource LSM theory; Signal-to-Noise Ratio (SNR); Numerical results; Conclusions.

  11. The Marmousi2 Model. [Figure: Z (km) 0-3, X (km) 0-16; the area in the white box is used for the SNR calculation.] 200 CSGs, Born approximation. Conventional encoding: static time shifts & polarity statics.
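A minimal sketch of the conventional encoding named on this slide, with illustrative array shapes (not the experiment's actual dimensions): each shot gather receives a random polarity (+/-1) and a random static time shift, and the encoded gathers are summed into one supergather.

```python
# Static time-shift and polarity encoding of shot gathers into a supergather (sketch).
import numpy as np

rng = np.random.default_rng(3)
n_shots, n_rec, nt = 200, 32, 500
shot_gathers = rng.standard_normal((n_shots, n_rec, nt))   # stand-in CSGs
max_shift = 100                                            # max static shift (samples)

polarity = rng.choice([-1.0, 1.0], size=n_shots)           # polarity statics
shift = rng.integers(0, max_shift + 1, size=n_shots)       # static time shifts

supergather = np.zeros((n_rec, nt + max_shift))
for i in range(n_shots):
    # encode shot i and blend it into the single supergather
    supergather[:, shift[i]:shift[i] + nt] += polarity[i] * shot_gathers[i]
```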

  12. Conventional Source: KM vs LSM (50 iterations). [Figure panels: Conventional KM (1x) and Conventional KLSM (50x); Z (km) 0-3, X (km) 0-16.]

  13. 200-Source Supergather: Multisource KM vs LSM. [Figure panels: Multisource KM, 1 iteration (1x/200), and Multisource KLSM, 300 iterations, I = 1.5 S (1.5x); Z (km) 0-3, X (km) 0-16.]

  14. What have we empirically learned?
                     Stnd. Mig      Multisrc. LSM
     IO              1              1/200
     Cost            1 (S = 200)    1.5 (I = 300)
     SNR             ~sqrt(GS)      ~sqrt(GI)
     Resolution dx   1              1/2
     Cost vs quality: can I << S?
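The ratios in this table follow from S = 200 and I = 300 alone (the geophone count G cancels); a quick arithmetic check:

```python
# Ratios implied by S = 200 CSGs vs I = 300 iterations on one supergather.
import math

S, I = 200, 300
print("I/O  ratio:", 1 / S)             # one supergather instead of S gathers -> 1/200
print("cost ratio:", I / S)             # 300 iterations vs 200 single-shot migrations -> 1.5
print("SNR  ratio:", math.sqrt(I / S))  # sqrt(G*I) / sqrt(G*S) ~ 1.22
```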

  15. SEG/EAGE Salt Reflectivity Model. [Figure: Z (km) 0-1.4, X (km) 0-6.] Constant velocity model with c = 2.67 km/s. Center frequency of the source wavelet f = 20 Hz. 320 shot gathers, Born approximation. Encoding: dynamic time statics, polarity statics + wavelet shaping.
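A hedged sketch of what "dynamic" encoding amounts to, under the common interpretation that the random codes are redrawn at every LSM iteration so the crosstalk is incoherent from one iteration to the next (wavelet shaping is omitted); the function name and shapes are illustrative, not the authors' implementation.

```python
# Dynamic encoding sketch: fresh random codes each time the blend is formed.
import numpy as np

def encode_supergather(shot_gathers, max_shift, rng):
    """Blend all shot gathers into one supergather with fresh random codes."""
    n_shots, n_rec, nt = shot_gathers.shape
    polarity = rng.choice([-1.0, 1.0], size=n_shots)
    shift = rng.integers(0, max_shift + 1, size=n_shots)
    blend = np.zeros((n_rec, nt + max_shift))
    for i in range(n_shots):
        blend[:, shift[i]:shift[i] + nt] += polarity[i] * shot_gathers[i]
    return blend

rng = np.random.default_rng(4)
gathers = rng.standard_normal((320, 16, 400))     # stand-in observed CSGs
d_k  = encode_supergather(gathers, 100, rng)      # codes used at iteration k
d_k1 = encode_supergather(gathers, 100, rng)      # fresh codes at iteration k+1
```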

  16. Standard Phase Shift Migration vs MLSM (Yunsong Huang). [Figure panels: Standard Phase Shift Migration, 320 CSGs (1x), and Multisource PLSM, 320 blended CSGs, 7 iterations (1x/44); Z (km) 0-1.4, X (km) 0-6.]

  17. Single-source PSLSM (Yunsong Huang). [Figure: model error (0.3 to 1.0) versus iteration number (0 to 50) for conventional encoding (polarity + time shifts) and unconventional encoding.]

  18. What have we empirically learned?
                     Stnd. Mig      Multisrc. LSM
     IO              1              1/320
     Cost            1 (S = 320)    1/44 (I = 7)
     SNR             ~sqrt(GS)      ~sqrt(GI)
     Resolution dx   1              1/2
     Cost vs quality: can I << S? Yes.

  19. Conclusions: Mig vs MLSM.
  1. SNR: sqrt(GS) vs sqrt(GI).
  2. Memory: 1 vs 1/S.
  3. Cost: S vs I.
  4. Caveat: migration and modeling operators were adjoints of one another; LSM is sensitive to the starting model.
  5. Unconventional encoding: I << S.
  Next step: sensitivity analysis with respect to the starting model.

  20. Back to the Future? Evolution of migration: 1960s-1970s poststack migration; 1980s DMO; 1980s-2010 prestack migration; 2010? poststack encoded migration.
