Presentation Transcript


  1. Woodblock Print, from “Thirty-Six Views of Mt. Fuji”, by K. Hokusai, ca. 1830. Research in Earthquake Physics, Forecasting, and Simulation-Based Probabilistic Hazard Assessment at the University of California, Davis. http://quakesim.jpl.nasa.gov. John B. Rundle, University of California, Davis, in collaboration with the QuakeSim Team. Presented at the Disaster Prevention Research Institute, Kyoto University, Kyoto, Japan, October 13, 2004.

  2. Collaborators
Andrea Donnellan, Division of Earth and Space Science, JPL
Geoffrey Fox, Dept. of Computer Science, Indiana University
Robert Granat, Data Understanding Systems Group, JPL
Lisa Grant, Dept. of Environmental Science, University of California, Irvine
James Holliday, Dept. of Physics and CSE, University of California, Davis
Peggy Li, Parallel Applications Technologies Group, JPL
Bill Klein, Dept. of Physics, Boston University; and CNLS, LANL
Greg Lyzenga, Dept. of Physics, Harvey Mudd College
Jorge Martins, Universidade Federal Fluminense, Niterói, Brazil
Dennis McLeod, Dept. of Computer Science, USC
Gleb Morein, CSE, University of California, Davis
Kazuyoshi Nanjo, CSE, UC Davis; and Tohoku University, Japan
Jay Parker, Satellite Geodesy and Geodetic Systems, JPL
Marlon Pierce, Community Grids Lab, Indiana University
Paul Rundle, Dept. of Physics, Harvey Mudd College; and CSE, University of California, Davis
Robert Shcherbakov, CSE, University of California, Davis
Kristy Tiampo, Dept. of Geology, University of Western Ontario, London, Ontario
Terry Tullis, Dept. of Geology, Brown University
Don Turcotte, Dept. of Geology, University of California, Davis
(JB Rundle et al., Phys. Rev. E 61, 2416, 2000; Geocomplexity and the Physics of Earthquakes, AGU Monograph, eds. JB Rundle, DL Turcotte, and W. Klein, 2000; PB Rundle et al., Phys. Rev. Lett. 87, 148501, 2001; JB Rundle, KF Tiampo, W. Klein, and JSS Martins, Proc. Nat. Acad. Sci. USA 99, Suppl. 1, 2514-2521, 2002)

  3. Space-Time Patterns of Historic Earthquakes. Earthquakes on major faults occur quasi-periodically. [Figure: major events on the San Andreas Fault, dated ca. A.D. 1100, 1346, 1480, 1680, 1812, and 1857. From K. Sieh et al., JGR, 94, 603 (1989).]

  4. Virtual California 2001 represents all the major active strike-slip faults in California (PB Rundle et al., to be submitted, 2004). Faults in RED are shown superposed on a Landsat image of California. (Image courtesy of Peggy Li, JPL)

  5. The Virtual California Simulation: Present Characteristics and Properties. The model addresses both time-averaged properties and space-time variability. Codes available from http://quakesim.jpl.nasa.gov.
• Backslip model: the topology of the fault system does not evolve; stress accumulation occurs as a result of negative slip, or "backslip".
• Linear interactions (stress transfer): interactions (stress Green's functions) are purely elastic.
• Arbitrarily complex 3D fault system topologies: all faults are vertical strike-slip faults; the boundary element mesh is ~10 km horizontal by ~15 km vertical; faults are embedded in an elastic half-space, and topologies do not evolve.
• Friction laws: based on the laboratory experiments of Tullis-Karner-Marone; random overshoot/undershoot is applied during slip; a stress-rate criterion for initiation of sliding is also included; a cellular automaton method is used to solve the stochastic evolution equations.
• Data assimilation: a static method using large historic earthquakes to compute friction parameters; geologic data are used to determine long-term slip rates.

  6. Remarks on Friction: Aseismic Slip ("Stress Leakage"). The factor α determines the fraction of total slip that is stable aseismic slip: α = (average stable aseismic slip) / (total slip). Baseline values of the parameter α are determined for each fault segment; it can easily be shown that α is an observable quantity. We use data from Deng and Sykes (1997) to determine the average values of stable interseismic, aseismic slip for many faults in California. [Figure: stress vs. time on a segment between the residual level σ_R and the failure level σ_F, shown for α = 0 (no leakage) and α > 0 (stress leaks aseismically between events); laboratory data from T. Tullis, PNAS, 1996.]

  7. Stochastic Friction: Mathematical Details. Friction experiments can be described by a stochastic "Leaky Threshold Equation", or "Hopfield Equation". The friction equation on each segment reads

dσ/dt = K·V_Load − Δσ·{ δ(t − t_F) + α·(∂σ/∂t − σ̇_c) } + η(t),   with Δσ = σ_F − σ_R,

i.e., "rate of change of stress = rate of stress supply − rate of stress dissipation". Here t_F represents all times for which σ(t_F) = σ_F; V_Load is the segment loading rate (the long-term fault slip rate), and K is the segment stiffness. The δ-function parametrizes the sudden slip event. The second term in braces is the "dynamic" stress triggering term, whose factor α characterizes the rate at which stress increases on fault segments. η(t) is a stochastic noise.
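A minimal single-segment sketch of these leaky-threshold dynamics is given below. All numerical values, the Gaussian form of the noise η(t), and the treatment of aseismic leakage as a constant fraction α of the stress supply are illustrative assumptions, not the calibrated Virtual California settings.

```python
import numpy as np

# Minimal sketch of a stochastic leaky-threshold (cellular-automaton style)
# friction law on one fault segment. Parameters are illustrative assumptions.
rng = np.random.default_rng(0)

K       = 1.0     # segment stiffness
V_load  = 0.01    # long-term loading rate (slip units per year)
sigma_F = 1.0     # failure stress threshold sigma_F (residual sigma_R = 0)
alpha   = 0.2     # fraction of stress supply "leaking" as stable aseismic slip
noise   = 0.005   # amplitude of the stochastic term eta(t)
dt      = 1.0     # time step (years)

sigma, events = 0.0, []
for step in range(20000):
    # Stress supply minus aseismic leakage, plus stochastic noise.
    sigma += (1.0 - alpha) * K * V_load * dt + noise * np.sqrt(dt) * rng.normal()
    if sigma >= sigma_F:                      # threshold reached: sudden slip event
        drop = sigma * rng.uniform(0.8, 1.2)  # random overshoot/undershoot
        events.append(step * dt)
        sigma -= drop

intervals = np.diff(events)
print(f"{len(events)} events, mean recurrence {intervals.mean():.1f} yr, "
      f"CV = {intervals.std() / intervals.mean():.2f}")
```

With these placeholder numbers the segment fails quasi-periodically at roughly sigma_F / ((1 − alpha)·K·V_load) ≈ 125-year intervals, with the variability controlled by the noise amplitude and the overshoot/undershoot factor.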

  8. Numerical Earthquake Forecasting. [Workflow diagram: Fault Data and Friction Data feed the Friction & Physics Modules, which produce Computed Surface Deformation and Computed Earthquake Histories; these are compared against Observed Earthquake Histories & Deformation for Model Steering via Data Assimilation, leading to Forecasts & Predictions.]

  9. Examples of Large Single Events. Two large events on the San Andreas Fault; the dark bar indicates the epicentral segment. Both events are non-local: slip occurs not only on contiguous fault segments, but also on segments disjoint from the main group of slipped segments.

  10. Snapshot of Space-Time Dynamics: 2,000 Years of Elapsed Time. [Plot of Log[1 − CFF(x,t)] over time (yr) vs. space (distance, km); the color cycle denotes CFF buildup and release. Labeled features: Garlock fault, creeping section, northern San Andreas (NSAF), southern San Andreas (SSAF).]

  11. Gutenberg-Richter Relation. [Magnitude-frequency plot from the simulation; the data follow a line of slope = −1.]

  12. Aftershocks from the 1906 San Francisco Earthquake During the Following 48 Hours (AJ Meltzner and DJ Wald, BSSA 93, 2160-2186, 2003)

  13. Space-Time Clustering. [Snapshots at time index 190 and time index 963; images produced with the Virtual California code by P. Li, JPL.]

  14. Simulation of 1,000 Years of California Earthquakes Obtained from the System-Level Virtual California Model of Earthquake Dynamics (http://pat.jpl.nasa.gov/public/RIVA/images.html). Simulations reveal space-time patterns in the InSAR fringes: each movie frame represents surface deformation over the previous 5 years, and frames advance 1 year at a time. Clustering of events in space and time can be seen. (Movie courtesy of P. Li, JPL)

  15. Application II: Statistical Earthquake Forecasting. The current method of the Working Group on California Earthquake Probabilities is shown below (Probabilities of Large Earthquakes in the San Francisco Bay Region, California, USGS Circular 1053, 1990).

  16. Forecasting Simulated San Andreas Events Using the Working Group Method: Computing the 30-Year Hazard Function for Large Bay Area Events (M > 6.7). This forecast does not consider the time since the last great northern San Andreas earthquake. Let F(t) = Prob(T < t) be the cumulative distribution of the recurrence interval T. Then the 30-year discrete hazard function, the expected failure rate over a 30-year span beginning at elapsed time t, is defined as:

H_D(t) = { F(t + 30) − F(t) } / { 1 − F(t) }

H_D(t) is the conditional probability that failure occurs during the 30-year window, given that it has not occurred prior to the start of the window:

H_D(t) = P( t < T < t + 30 | T > t )

Note that H_D(t) computed from the simulation data is qualitatively different from the function computed from the Log Normal and BPT distributions, due to the long tails of the Log Normal and BPT (there are no infinite intervals in the simulations).
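As a concrete illustration, the discrete hazard function can be evaluated directly from a set of simulated recurrence intervals. The sketch below uses synthetic Log Normal intervals as a stand-in for Virtual California output; the function name is hypothetical.

```python
import numpy as np

# Empirical 30-year discrete hazard function H_D(t) from recurrence intervals.
def hazard_function(intervals, t, window=30.0):
    """H_D(t) = [F(t + window) - F(t)] / [1 - F(t)], with F the empirical CDF."""
    iv = np.sort(np.asarray(intervals, dtype=float))
    F = lambda x: np.searchsorted(iv, x, side="right") / iv.size
    denom = 1.0 - F(t)
    return (F(t + window) - F(t)) / denom if denom > 0 else 1.0

# Synthetic stand-in for simulated large Bay Area recurrence intervals.
rng = np.random.default_rng(1)
sim_intervals = rng.lognormal(mean=np.log(170.0), sigma=0.5, size=4000)

for elapsed in (0.0, 50.0, 100.0, 200.0):
    p = hazard_function(sim_intervals, elapsed)
    print(f"elapsed t = {elapsed:5.0f} yr -> 30-yr conditional probability {p:.3f}")
```

Because the empirical CDF has no infinite tail, H_D(t) from simulation data eventually climbs toward 1, unlike the Log Normal and BPT forms.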

  17. Forecasting the Next Great San Francisco Earthquake on the Northern San Andreas Fault. We compute (measure) a family of conditional probability distributions P_m(t < T), conditioned on no great San Francisco earthquake having occurred since the last such event. P_m(t < T) is the probability that an event with magnitude M > m will occur prior to a time T from the present. Here we show computations for the cases in which an event of either m = 7.2 or m = 7.4 occurs on the yellow section of the northern SAF. The yellow region is 0.25 ≤ P_m(t < T) ≤ 0.75, the middle 50%. The red diamond represents the value for today, 98.2 years after the great 1906 San Francisco earthquake. The dashed red vertical lines represent the average recurrence intervals: 173.8 years for m = 7.2, and 343.8 years for m = 7.4.
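The same empirical construction gives this conditional forecast: keep only simulated cycles that have survived past the elapsed time (98.2 years), and read off the distribution of the residual waiting times. The interval distribution below is an assumed placeholder, not the actual stacked simulation data.

```python
import numpy as np

def conditional_cdf(intervals, elapsed, horizons):
    """P_m(t < T): probability of an event within T years of today,
    given survival of `elapsed` years since the last great event."""
    iv = np.asarray(intervals, dtype=float)
    remaining = iv[iv > elapsed] - elapsed      # residual waiting times
    return np.array([np.mean(remaining < T) for T in horizons])

rng = np.random.default_rng(2)
cycles = rng.lognormal(np.log(174.0), 0.6, size=10000)  # assumed m = 7.2 cycle lengths

for T, p in zip((10, 30, 50, 100),
                conditional_cdf(cycles, elapsed=98.2, horizons=(10, 30, 50, 100))):
    print(f"P(event within {T:3d} yr | 98.2 yr elapsed) = {p:.2f}")
```

The 25% and 75% quantiles of `remaining` would trace out the yellow middle-50% band shown on the slide.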

  18. Pattern Informatics in Earthquake Data: Southern California, 1932-1999. Earthquakes in southern California have been systematically recorded since 1932. The rate at which these events occur can be used to define activity time series in 11 km x 11 km spatial boxes, from which spatial patterns are found by Karhunen-Loeve (KL) decomposition, also known as PCA. (Figures courtesy of KF Tiampo.)
• Map of the relative intensity of seismic activity in southern California, 1932-1999; this can be considered a seismic "hazard map".
• Map of the first PCA mode, which we call the "Hazard Mode": red areas tend to be active or inactive at the same time.
• Map of the second PCA mode, which we call the "Landers Mode": red areas tend to be inactive when blue areas are active, and vice versa; all sites in a blue or red area tend to be active (or inactive) at the same time.
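A minimal sketch of the KL (PCA) step, assuming the catalog has already been binned into per-box activity time series; the Poisson counts here are random placeholders for real catalog data.

```python
import numpy as np

# KL / PCA modes of gridded seismicity: eigenvectors of the box-by-box
# correlation matrix of mean-removed activity time series (placeholder data).
rng = np.random.default_rng(3)
n_boxes, n_years = 200, 68                     # e.g. annual samples, 1932-1999
activity = rng.poisson(2.0, (n_boxes, n_years)).astype(float)

activity -= activity.mean(axis=1, keepdims=True)   # remove each box's mean rate
corr = activity @ activity.T / n_years             # equal-time correlation matrix

eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                  # strongest modes first
mode1 = eigvecs[:, order[0]]                       # analog of the "Hazard Mode"
mode2 = eigvecs[:, order[1]]                       # analog of the "Landers Mode"
print("variance fraction of first two modes:",
      eigvals[order[:2]].sum() / eigvals.sum())
```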

  19. Forecasting Earthquakes via the Pattern Informatics ("Hotspot") Method. Earthquake activity over a period of time can be represented by a state vector ψ(x, t), which can be written as a sum over KL eigenfunctions. Differences in state vectors have been found to represent a probability measure for future activity; the method analyzes the shifting patterns of earthquakes through time. Using Pattern Informatics to generate an earthquake forecast (2000 to ~2010); a code sketch follows the steps below:
• 1. Spatially coarse-grain (tile) the region with boxes 0.1° x 0.1° on a side (~3,000 boxes, ~2,000 with at least one earthquake from 1932 to 2000). This scale is approximately the size of an M ~ 6 earthquake, although the method seems to be sensitive down to a level of M ≈ 5.
• 2. I₁(xᵢ) ≡ temporal average of seismic activity in box i from 1932 to 1990, for earthquakes with M ≥ 3, averaged over 11 km x 11 km boxes.
• 3. I₂(xᵢ) ≡ temporal average of activity from 1932 to 2000, for earthquakes with M ≥ 3, averaged over 11 km x 11 km boxes.
• 4. ΔI(xᵢ) = I₂(xᵢ) − I₁(xᵢ) ≡ change in average activity, 1990 to 2000, for earthquakes with M ≥ 3, averaged over 11 km x 11 km boxes.
• 5. Seismic potential P(xᵢ) = {ΔI(xᵢ)}² − ⟨{ΔI(xᵢ)}²⟩ ≡ increase in probability for earthquakes with M ≥ 3, averaged over 11 km x 11 km boxes; the symbol ⟨ ⟩ represents a spatial average.
• 6. Color-code the result. From retrospective studies, we find that P(xᵢ) measures not only the average change in activity during 1990-2000 (triangles at right), but also indicates locations of future activity for the period ~2000 to 2010.
Green triangles: M ≥ 5, 1990-2000. Plot shows Log10(Seismic Potential) for large earthquakes, M ≥ 5, ~2000 to 2010.
(JB Rundle, KF Tiampo, W. Klein, and JSS Martins, PNAS 99, Suppl. 1, 2514-2521, Feb 19, 2002; KF Tiampo, JB Rundle, S. McGinnis, S. Gross, and W. Klein, Europhys. Lett., 60, 481-487, 2002)
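A compact sketch of steps 2-5 on synthetic gridded activity. A real forecast would bin an actual catalog into the boxes of step 1; the normalization details of the published method are omitted here, so this is an illustration of the recipe rather than the full algorithm.

```python
import numpy as np

# Pattern Informatics "seismic potential" from the change in average activity.
rng = np.random.default_rng(4)
years = np.arange(1932, 2001)                       # catalog years 1932-2000
n_boxes = 2000
rates = rng.gamma(2.0, 1.0, size=n_boxes)           # placeholder long-term rates
activity = rng.poisson(rates[:, None], (n_boxes, years.size)).astype(float)

I1 = activity[:, years <= 1990].mean(axis=1)        # step 2: average, 1932-1990
I2 = activity.mean(axis=1)                          # step 3: average, 1932-2000
dI = I2 - I1                                        # step 4: change in activity
potential = dI**2 - np.mean(dI**2)                  # step 5: P(x_i), spatial mean removed

hotspots = potential > 0                            # boxes colored as hotspots (step 6)
print(f"{hotspots.sum()} of {n_boxes} boxes flagged as hotspots")
```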

  20. Status of the Real-Time Earthquake Forecast Experiment (Composite Version): How Are We Doing? (Composite N-S Catalog) (JB Rundle et al., PNAS 99, Suppl. 1, 2514-2521, Feb 19, 2002; KF Tiampo et al., Europhys. Lett., 60, 481-487, 2002; JB Rundle et al., Rev. Geophys. Space Phys., 41(4), DOI 10.1029/2003RG000135, 2003; http://quakesim.jpl.nasa.gov) Plot shows Log10(Seismic Potential), the increase in potential for large earthquakes, M ≥ 5, ~2000 to 2010. Green triangles mark locations of large earthquakes between Jan 1, 1990 and Dec 31, 1999. Fourteen large events (blue circles) with M ≥ 5 have occurred in central or southern California on anomalies, or within the margin of error (+/- 11 km; data from the S. CA and N. CA catalogs):
After the work was completed:
1. Big Bear I, M = 5.1, Feb 10, 2001
2. Coso, M = 5.1, July 17, 2001
After the paper was in press (September 1, 2001):
3. Anza, M = 5.1, Oct 31, 2001
After the paper was published (February 19, 2002):
4. Baja, M = 5.7, Feb 22, 2002
5. Gilroy, M = 4.9-5.1, May 13, 2002
6. Big Bear II, M = 5.4, Feb 22, 2003
7. San Simeon, M = 6.5, Dec 22, 2003
8. San Clemente Island, M = 5.2, June 15, 2004
9. Bodie I, M = 5.5, Sept 18, 2004
10. Bodie II, M = 5.4, Sept 18, 2004
11. Parkfield I, M = 6.0, Sept 28, 2004
12. Parkfield II, M = 5.2, Sept 29, 2004
13. Arvin, M = 5.0, Sept 29, 2004
14. Parkfield III, M = 5.0, Sept 30, 2004
Note: The original forecast was made using only the southern California catalog, which does not contain earthquakes from central and northern California. This composite plot was made using data from both the northern California catalog and the southern California catalog after the San Simeon event occurred; the N. Calif. earthquake catalog is used north of the dashed line, and the S. Calif. catalog is used south of it.
Note (Oct 6, 2004): Two events were removed from the previous scorecard due to erroneous magnitude assignments by USGS-Berkeley (Bodie I, M = 5.0, Jan 21, 2000; Parkfield II, M = 5.0, Sept 28, 2004). CL#03-2015

  21. Comparison of "Original" and "New Composite" Hotspot Maps. Original map (as published in Proc. Nat. Acad. Sci. USA, Feb 19, 2002): S. CA catalog plus selective additions from the N. CA catalog. New composite map: S. CA catalog plus the full N. CA catalog.

  22. How Can We Evaluate Success or Failure? We have an experiment in which:
• Hotspots represent a fraction p of the active area of California
• White space represents a fraction q = 1 − p
• We ask for the probability that, with N large earthquakes, n will land in the hotspots by some random process ("throwing darts")
This problem is identical to the following problem:
• An urn is filled with balls
• A fraction p are white ("hits")
• A fraction q = 1 − p are black ("misses")
• We ask for the probability that, by blindly selecting N balls from the urn, n of the balls will turn out to be white
The appropriate probability distribution is the binomial distribution.

  23. Is This Apparent Agreement an Accident? Computing the probability that the apparent agreement in location is due to random chance: three computations, in a simplistic analysis based on the binomial distribution (see the sketch below).
• 1. New composite scorecard. Conditions: 14 large events; 13 lie on hotspots or within the margin of error of +/- 11 km (12 of 14 for the original scorecard). The probability of a hit is assumed proportional to the percentage of active area covered by the enlarged hotspots, ~0.24 (figure as per M. Blanpied, USGS).
• 2. Original scorecard. Conditions: 14 large events; 9 "hits" within actual hotspots, 5 "misses" (estimates from R. Stein, USGS, and JBR). The probability of a hit is assumed proportional to the percentage of active area covered by the hotspots, ~0.09 (figure as per M. Blanpied).
• 3. Conditions: 10 mainshock-foreshock groups of large events; 9 occur in association with an enlarged hotspot. The probability of a hit is assumed proportional to the percentage of active area covered by the enlarged hotspots, ~0.24 (figure as per M. Blanpied).
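Each computation reduces to a binomial tail probability: the chance of at least n hits among N events when a single random "dart" hits with probability p. The sketch below plugs in the three sets of conditions above.

```python
from math import comb

def p_at_least(n, N, p):
    """P(at least n of N random events land on hotspots of area fraction p)."""
    return sum(comb(N, k) * p**k * (1 - p)**(N - k) for k in range(n, N + 1))

print(f"1. new composite: P(>=13 of 14 | p = 0.24) = {p_at_least(13, 14, 0.24):.2e}")
print(f"2. original:      P(>= 9 of 14 | p = 0.09) = {p_at_least(9, 14, 0.09):.2e}")
print(f"3. groups:        P(>= 9 of 10 | p = 0.24) = {p_at_least(9, 10, 0.24):.2e}")
```

Small values indicate that the agreement in location is unlikely to be due to random chance under this simplistic model.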

  24. Optimal Forecasts of Shallow Earthquake Locations in Japan: Kobe Area (including the Oct 6, 2000, M = 7.3 Tottori earthquake). The JMA catalog is used. Several variations of the basic method, which differ in the order in which operations are applied, are compared below (K. Nanjo, JB Rundle, J. Holliday, DL Turcotte, 2004). Forecast period: January 1, 2000 to December 31, 2010. Parameters used: t0 = Jan 1, 1970; t1 = Jan 1, 1974; t2 = Dec 31, 1999 (latitudes 32.9-36.4, longitudes 133.0-137.0). Triangles: large events with M ≥ 5 during t1 ≤ t ≤ t2. Circles: large events with M ≥ 5 during t > t2. All events with M ≥ 3 in the catalog are used, from the surface down to a depth of 20 km only. Plot shows Log10(Seismic Potential), the increase in potential for large earthquakes, M ≥ 5, ~2000 to 2010.

  25. Optimal Forecasts of Shallow Earthquake Locations in Japan: Tokyo Area. The JMA catalog is used. Several variations of the basic method, which differ in the order in which operations are applied, are compared below (K. Nanjo, JB Rundle, J. Holliday, DL Turcotte, 2004). Forecast period: January 1, 2000 to December 31, 2010. Parameters used: t0 = Jan 1, 1980; t1 = Jan 1, 1990; t2 = Dec 31, 1999. Triangles: large events with M ≥ 5 during t1 ≤ t ≤ t2. Circles: large events with M ≥ 5 during t > t2. All events with M ≥ 3 in the catalog are used, from the surface down to a depth of 20 km only. Plot shows Log10(Seismic Potential), the increase in potential for large earthquakes, M ≥ 5, ~2000 to 2010.

  26. Summary. Numerical simulations like Virtual California can help us understand the system-level physics of earthquakes because:
• The effects of complex interactions among faults in the system are difficult to understand without a topologically realistic model
• Many scales in space and time are strongly coupled, and comprehensive observations spanning all these scales cannot be made
• Simulations allow us to relate the observed space-time earthquake patterns, obtained from seismicity and InSAR observations, to the underlying stress and strain
Forecasting and prediction:
• Is strongly affected by the many complex interactions among faults in the system
• Can be facilitated by using simulations to generate probability density functions for use in developing forecast methods
• Hotspot maps can be constructed that indicate likely locations of future earthquakes with M ≥ 5, using techniques that originated with methods for forecasting the El Niño-Southern Oscillation
• Real-time tests of hotspot maps indicate that this forecast technique holds considerable promise

  27. How Are We Doing? Status of the Real-Time Earthquake Forecast Experiment (JB Rundle, KF Tiampo, W. Klein, and JSS Martins, PNAS 99, Suppl. 1, 2514-2521, Feb 19, 2002; KF Tiampo, JB Rundle, S. McGinnis, S. Gross, and W. Klein, Europhys. Lett., 60, 481-487, 2002). Left is a color contour plot of changes in Log10{increase in probability ΔP} for large earthquakes, computed over the period Jan 1, 1989 to Dec 31, 1999. The colored areas, where ΔP > 0, were computed by our "PDPC" state vector method, and are an indication not only of activity during 1989-1999, but also of future activity in central and southern California for the years 2000-2010. Five larger events with M > 5 have occurred in central or southern California on anomalies, or within the margin of error (+/- 11 km; data from the SCEC catalog):
After the work was completed:
1. Big Bear I, M = 5.1, Feb 10, 2001
After the paper was in press (September 1, 2001):
2. Anza, M = 5.1, Oct 31, 2001
After the paper was published (February 19, 2002):
3. Baja, M = 5.7, Feb 22, 2002
4. Gilroy, M = 4.9-5.1, May 13, 2002
5. Big Bear II, M = 5.4, Feb 22, 2003

  28. Aftershocks from the 1906 San Francisco Earthquake Through December 1907 (AJ Meltzner and DJ Wald, BSSA 93, 2160-2186, 2003)

  29. Variability of Activity on the San Andreas Fault: Simulations (40,000 Years of Activity). The coefficient of variation is CV = σ_T / ⟨T⟩, the standard deviation of the recurrence interval divided by its mean. [Panels: coefficient of variation vs. distance; Log10{Recurrence Interval} vs. distance; fault friction vs. distance along the fault.]

  30. Unconditional Probabilities from a Model. We compute the histogram of large Bay Area earthquakes (M > 6.7) occurring in a 30-year window at a time T prior to and after a "great" northern San Andreas (NSAF) earthquake (M > 7.4). Successive great-earthquake cycles (Great EQ Cycle #1, Great EQ Cycle #2, etc.) are stacked, yielding precursory statistics (windows before the great event) and postseismic statistics (windows after it); see the sketch below. [Histogram axes: number of time windows vs. number of large SF Bay Area earthquakes in a time window.]
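A sketch of the stacking procedure, with synthetic event times standing in for the simulated catalogs:

```python
import numpy as np

# Count large Bay Area events in 30-yr windows at lag T relative to each
# "great" NSAF event, stacked over all great-earthquake cycles (synthetic data).
rng = np.random.default_rng(5)
great_times = np.sort(rng.uniform(0, 40000, 150))   # stand-in great NSAF events
large_times = np.sort(rng.uniform(0, 40000, 3000))  # stand-in large Bay Area events

def window_counts(lag, width=30.0):
    """Events in [t_great + lag, t_great + lag + width), one count per cycle."""
    lo = np.searchsorted(large_times, great_times + lag)
    hi = np.searchsorted(large_times, great_times + lag + width)
    return hi - lo

for lag in (-60.0, -30.0, 0.0, 30.0):
    c = window_counts(lag)                # a histogram of c gives the slide's plot
    print(f"lag {lag:+6.0f} yr: mean {c.mean():.2f} events per 30-yr window")
```

Negative lags give the precursory statistics, positive lags the postseismic statistics.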

  31. Ensemble Forecasts Give Absolute Probabilities (assuming 30-year time-averaging windows). A "stress shadow", or regional stress recovery effect, can be seen, as well as "precursory activation".

  32. Application: Search for Phase Locking of Two Fault Groups, the Eastern Mojave Shear Zone faults and the LA Basin faults. Can patterns of phase-locked faults develop?

  33. Activity (Magnitude) and Coulomb Failure Function Time Series. [Time series of magnitude and CFF vs. time; blue: LA Basin, red: Eastern Mojave Shear Zone.]

  34. Scaled Moment Release and Scaled Coulomb Failure Function Time Series. [Time series of scaled moment release and scaled CFF vs. time; blue: LA Basin, red: Eastern Mojave Shear Zone.]

  35. Auto-correlations and cross-correlations for the CFF time series (top) and moment release time series (bottom), as a function of lag; blue: LA Basin, red: Eastern Mojave Shear Zone. These plots provide some limited evidence that the CFF and moment release time series can become phase locked for periods as long as 10,000 years. A sketch of the correlation measure follows.
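A sketch of the normalized correlation measure, applied here to synthetic, deliberately phase-locked series; the real input would be the simulated CFF or moment-release time series, and calling `xcorr(a, a, ...)` gives the auto-correlation.

```python
import numpy as np

def xcorr(a, b, max_lag):
    """Normalized cross-correlation of two series for lags -max_lag..+max_lag."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n = a.size
    return np.array([np.mean(a[max(0, -lag):n - max(0, lag)] *
                             b[max(0, lag):n - max(0, -lag)])
                     for lag in range(-max_lag, max_lag + 1)])

rng = np.random.default_rng(6)
t = np.arange(5000)
la_basin = np.sin(2 * np.pi * t / 400) + 0.5 * rng.normal(size=t.size)
mojave   = np.sin(2 * np.pi * (t - 50) / 400) + 0.5 * rng.normal(size=t.size)

cc = xcorr(la_basin, mojave, max_lag=600)
lag = np.argmax(cc) - 600                 # lag of the peak correlation
print(f"peak cross-correlation {cc.max():.2f} at lag {lag} yr")
```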

  36. How the Hotspots Change Through Time

  37. How the Hotspots Change Through Time

  38. 30-Year Hazard Function for Simulated San Andreas Events with M > 7.5. [Multiple panels, each for events with M > 7.5.]

  39. Gutenberg-Richter Relation. [Magnitude-frequency plot from the simulation; the data follow a line of slope = −1.]

  40. Variability of Activity on the San Andreas Fault: Simulations (40,000 Years of Activity). The coefficient of variation is CV = σ_T / ⟨T⟩, the standard deviation of the recurrence interval divided by its mean. [Panels: coefficient of variation vs. distance; Log10{Recurrence Interval} vs. distance; fault friction vs. distance along the fault.]

  41. Statistics of San Andreas Fault Events over 40,000 Years of Activity. We measure the time interval between two successive events of any magnitude (M > 0) on the blue segments and red segments of the San Andreas. We plot these statistics both as histograms and as cumulative distribution functions. The solid lines are the Log Normal probability density functions (top) and Log Normal cumulative distribution functions (bottom) having the same means and variances as the simulation data; the dashed lines are the same for the Brownian Passage Time (BPT) distribution. A fitting sketch follows.
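A sketch of the moment-matched comparison: SciPy is assumed available, the Weibull intervals are synthetic stand-ins for the simulation data, and the BPT comparison would use the inverse Gaussian (Wald) family in the same way.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
intervals = rng.weibull(1.5, 5000) * 150.0     # placeholder recurrence intervals

# Log Normal distribution with the same mean and variance as the data:
m, v = intervals.mean(), intervals.var()
s2 = np.log(1.0 + v / m**2)                    # lognormal shape from mean/variance
ln = stats.lognorm(s=np.sqrt(s2), scale=np.exp(np.log(m) - 0.5 * s2))

D, p = stats.kstest(intervals, ln.cdf)         # distance between empirical & fitted CDFs
print(f"KS distance = {D:.3f}, p = {p:.3g}")
```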

  42. Forecasting Simulated San Andreas Events Using the Working Group Method: Computing the 30-Year Hazard Function for Events of All Magnitudes (M > 0). Let F(t) = Prob(T < t) be the cumulative distribution of the recurrence interval T. Then the 30-year discrete hazard function, the expected failure rate over a 30-year span beginning at elapsed time t, is H_D(t) = { F(t + 30) − F(t) } / { 1 − F(t) }. H_D(t) is the conditional probability that failure occurs during the 30-year window, given that it has not occurred prior to the start of the window: H_D(t) = P( t < T < t + 30 | T > t ). Note that H_D(t) computed from the simulation data is qualitatively different from the function computed from the Log Normal and BPT distributions, due to the long tails of the Log Normal and BPT (there are no infinite intervals in the simulations).

  43. 30-Year Hazard Function for Simulated San Andreas Events with M > 7.5. [Multiple panels, each for events with M > 7.5.]

  44. 10-Year vs. 30-Year Hazard Functions for Simulated Events with M > 0. Both the 10-year and 30-year hazard functions look qualitatively similar:
• An initial linear increase from elapsed time T = 0
• An intermediate period of relatively constant conditional probability
• A final rapid increase to a conditional probability ~ 1
