
Statistics of Seismicity and Uncertainties in Earthquake Catalogs: Forecasting Based on Data Assimilation


Presentation Transcript


  1. Statistics of Seismicity and Uncertainties in Earthquake Catalogs: Forecasting Based on Data Assimilation. Maximilian J. Werner, Swiss Seismological Service, ETHZ. Didier Sornette (ETHZ), David Jackson and Kayo Ide (UCLA), Stefan Wiemer (ETHZ).

  2. Statistical Seismology • stochastic and clustered earthquakes • uncertain representations of earthquakes in catalogs • scientific hypotheses, models, forecasts

  3. Magnitude Fluctuations [figure: frequency-magnitude distribution of the relocated Hauksson catalog (Southern California, 1984-2002), following the Gutenberg-Richter law with b = 1]
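
For reference, the Gutenberg-Richter law quoted on this slide can be written in standard notation (generic symbols, not taken verbatim from the slide):

```latex
% Gutenberg-Richter frequency-magnitude law:
% N(\geq m) is the number of events with magnitude at least m;
% the slide quotes a b-value of about 1 for the relocated Hauksson catalog.
\log_{10} N(\geq m) = a - b\,m , \qquad b \approx 1
```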

  4. Rate Fluctuations [figures: seismicity rate of the relocated Hauksson catalog (1984-2002), dominated by the M 6.6 Superstition Hills 1987, M 7.3 Landers 1992, M 6.4 Northridge 1994 and M 7.1 Hector Mine 1999 sequences; aftershock rate versus days since mainshock and triggered events versus magnitude illustrate the Omori-Utsu and productivity laws]
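
The two scaling laws named on this slide are conventionally written as follows (standard forms with generic parameters K, c, p, alpha and reference magnitude m_0; the slide's exact notation is not preserved in the transcript):

```latex
% Omori-Utsu law: aftershock rate as a function of time t since a mainshock
n(t) = \frac{K}{(t + c)^{p}}

% Productivity law: expected number of events triggered by an earthquake
% of magnitude m, relative to a reference magnitude m_0
\rho(m) \propto 10^{\alpha\,(m - m_0)}
```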

  5. Spatial Fluctuations [figure: epicenter map of the relocated Hauksson catalog (1984-2002), highlighting the M 5.4 Oceanside 1986, M 7.3 Landers 1992, M 6.4 Northridge 1994 and M 7.1 Hector Mine 1999 sequences]

  6. Seismicity Models, from simple to complex: • time-independent random (Poisson process) • time-dependent, no clustering (renewal process) • time-dependent, simple clustering (Poisson cluster models) • time-dependent, linear cascades of clusters (epidemic-type earthquake sequences), the current "gold standard" null hypothesis • time-dependent, non-linear cascades of clusters

  7. A Strong Null Hypothesis: the Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, 1988, 1998) combines the Gutenberg-Richter, Omori-Utsu and productivity laws with time-independent spontaneous events, under the assumption that every earthquake independently triggers events (of any size).
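
One common way to write the temporal ETAS conditional intensity combining these ingredients (a sketch in generic notation; conventions for the productivity term vary between e^{\alpha} and 10^{\alpha}):

```latex
% ETAS conditional intensity: \mu is the rate of time-independent spontaneous
% (background) events; each past event i with magnitude m_i independently
% triggers aftershocks through the productivity and Omori-Utsu terms.
\lambda(t \mid \mathcal{H}_t) = \mu + \sum_{t_i < t} \frac{K\,10^{\alpha\,(m_i - m_0)}}{(t - t_i + c)^{p}}
% Magnitudes of all events, triggered or spontaneous, are drawn independently
% from the Gutenberg-Richter distribution.
```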

  8. Earthquake Forecasts: experimental forecasts for California based on the ETAS model

  9. Effects of Undetected Quakes on Observable Seismicity • why small earthquakes matter • why undetected quakes, absent from catalogs, matter • using a model to simulate their effects • implications of neglecting them Sornette & Werner (2005a, 2005b), J. Geophys. Res.

  10. Magnitude Uncertainties Impact Seismic Rate Estimates, Forecasts and Predictability Experiments. Outline: • quantify magnitude uncertainties • analyze their impact on the forecasts of short-term models • how are noisy forecasts evaluated in current tests? • how to improve the tests and the forecasts. Werner & Sornette (2007), in revision, J. Geophys. Res.

  11. Earthquakes, Catalogs and Models [diagram: earthquakes pass through a measurement process to produce an earthquake catalog; the catalog calibrates a seismicity model (model parameters); the calibrated model issues forecasts, which are evaluated for consistency against new catalog data; the forecasts are treated as exact and the new data as noisy, while the measurement uncertainties in the catalog are neglected]

  12. Magnitude Noise and Daily Forecasts of Clustering Models. Context: the Collaboratory for the Study of Earthquake Predictability (CSEP) and the Regional Earthquake Likelihood Models (RELM) daily earthquake forecast competition. I will focus on random magnitude errors and short-term clustering models.

  13. Moment Magnitude Uncertainties, CMT vs. USGS [figures: distribution of the differences between magnitude estimates and a "Hill" plot of the scale parameter]. The differences follow a Laplace distribution, p(\epsilon) = \frac{1}{2\nu}\exp(-|\epsilon|/\nu), with scale parameter \nu.
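
A minimal sketch of how such a Laplace scale parameter can be estimated from paired magnitude estimates (illustrative code, not the authors'; the synthetic catalogs and parameter values are stand-ins):

```python
import numpy as np

def laplace_fit(m_catalog_a, m_catalog_b):
    """Fit a Laplace distribution to the differences of paired magnitude estimates.

    The maximum-likelihood location is the median of the differences and the
    maximum-likelihood scale is the mean absolute deviation from that median.
    """
    diff = np.asarray(m_catalog_a) - np.asarray(m_catalog_b)
    loc = np.median(diff)
    scale = np.mean(np.abs(diff - loc))
    return loc, scale

# Toy usage with synthetic magnitudes standing in for CMT and USGS estimates
rng = np.random.default_rng(0)
true_m = rng.uniform(5.5, 7.5, size=1000)
m_a = true_m + rng.laplace(scale=0.07, size=true_m.size)
m_b = true_m + rng.laplace(scale=0.07, size=true_m.size)
print(laplace_fit(m_a, m_b))
```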

  14. Short-Term Clustering Models: the productivity, Omori-Utsu and Gutenberg-Richter laws. These 3 laws are used in models by Vere-Jones (1970), Kagan and Knopoff (1987), Ogata (1988), Reasenberg and Jones (1989), Gerstenberger et al. (2005), Zhuang et al. (2005), Helmstetter et al. (2006), Console et al. (2007), ...

  15. A Simple Cluster Model: mainshocks act as cluster centers and each triggers a cluster of aftershocks, defining the earthquake rate; the magnitudes of both the centers and the aftershocks are noisy. What are the fluctuations of the resulting deviations of the rate?
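
A generic sketch of the perturbation in question (my notation, not the slide's equations): with the productivity law, a magnitude error epsilon_i multiplies the contribution of event i to the rate by a random factor.

```latex
% Rate of a simple Poisson cluster model (Omori-Utsu kernel, productivity law):
\lambda(t) = \sum_{i:\, t_i < t} \frac{K\,10^{\alpha\,(m_i - m_0)}}{(t - t_i + c)^{p}}

% With noisy magnitudes m_i^{\mathrm{obs}} = m_i + \epsilon_i, the rate estimated
% from the catalog becomes
\hat{\lambda}(t) = \sum_{i:\, t_i < t} 10^{\alpha\,\epsilon_i}\,
                   \frac{K\,10^{\alpha\,(m_i - m_0)}}{(t - t_i + c)^{p}}
```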

  16. Distributions of Perturbed Rates [figures: probability density functions of the perturbed rates]

  17. Heavy Tails of Perturbed Rates [figures: survivor functions of the perturbed rates, for a single catalog realization and after averaging in the Levy or Gauss regime; the combination of the productivity law of aftershocks and the noise scale parameter produces power-law tails, whose survivor-function exponent is set by these two parameters]
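
A one-line sketch of why power-law tails arise (generic symbols, consistent with the slide's pairing of the productivity exponent and the noise scale parameter): if the magnitude noise is Laplace-distributed with scale \nu, the multiplicative factor 10^{\alpha\epsilon} has a power-law survivor function.

```latex
% For x > 1:
P\!\left(10^{\alpha\epsilon} > x\right)
  = P\!\left(\epsilon > \tfrac{\log_{10} x}{\alpha}\right)
  = \tfrac{1}{2}\exp\!\left(-\tfrac{\log_{10} x}{\alpha\,\nu}\right)
  = \tfrac{1}{2}\, x^{-1/(\alpha\,\nu\,\ln 10)}
% i.e. a power law whose exponent is controlled jointly by the productivity
% exponent \alpha and the noise scale parameter \nu.
```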

  18. Evaluating Noisy Forecasts. Conduct a numerical experiment: • Simulate earthquake "reality" according to our simple cluster model • Make "reality" noisy • Generate forecasts from the noisy data • Submit the forecasts to a mock CSEP/RELM test center • Test the noisy forecasts against "reality" using the currently proposed consistency tests • Reject models at the tests' 90% confidence level (i.e. expect 1 in 10 true models to be rejected wrongly) • Calibrate the parameters of the experiment to mimic California. How important are the fluctuations in the evaluation of forecasts?
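
A minimal sketch of such a mock experiment (illustrative code and parameter values, not the authors' test-centre implementation): "reality" is drawn from Poisson rates, the forecast is distorted by Laplace magnitude noise acting through an assumed productivity exponent, and a Poisson N-test checks the observed total count against the noisy forecast.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(42)

ALPHA = 0.8      # assumed productivity exponent
N_BINS = 200     # number of space-time bins in the forecast
N_TRIALS = 10    # number of simulated "models" submitted to the mock test centre

def n_test_rejects(true_rates, noise_scale, confidence=0.90):
    """One trial: does the Poisson N-test reject the noise-distorted forecast?"""
    observed = rng.poisson(true_rates).sum()                     # "reality"
    if noise_scale > 0:
        eps = rng.laplace(scale=noise_scale, size=true_rates.size)
    else:
        eps = np.zeros(true_rates.size)
    forecast_total = (true_rates * 10.0 ** (ALPHA * eps)).sum()  # noisy forecast
    lo, hi = poisson.interval(confidence, forecast_total)
    return not (lo <= observed <= hi)

true_rates = rng.uniform(0.05, 0.5, size=N_BINS)
for noise in [0.0, 0.05, 0.1, 0.2]:
    rejected = sum(n_test_rejects(true_rates, noise) for _ in range(N_TRIALS))
    print(f"magnitude-noise scale {noise}: {rejected}/{N_TRIALS} forecasts rejected")
```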

  19. Numerical Experiment Results [table: number of rejected "models" for each level of noise and whether the assumed 90% confidence bounds are violated: 0/10 (no), 10/60 (probably), 9/10 (yes), 7/10 (yes), 10/10 (yes)]

  20. Implications • Forecasts are noisy and not an exact expression of the model's underlying scientific hypothesis. • The variability of observations consistent with a model is non-Poissonian once uncertainties are accounted for. • The particular idiosyncrasies of each model also cannot be captured by a Poisson distribution. • But the consistency tests assume Poissonian variability! • Models themselves should generate the full distribution. • Complex noise propagation can be simulated. • Two approaches: (1) simple bootstrap: sample from past data distributions to generate many forecasts (see the sketch below); (2) data assimilation: correct observations by prior knowledge in the form of a model forecast.
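
A minimal sketch of the simple-bootstrap approach (illustrative names and parameters, not the authors' implementation): resample plausible true magnitudes many times, recompute the forecast for each resample, and report the full distribution of forecast rates instead of a single Poisson mean.

```python
import numpy as np

rng = np.random.default_rng(1)
ALPHA = 0.8          # assumed productivity exponent
NOISE_SCALE = 0.1    # assumed Laplace scale of the magnitude errors

def forecast_rate(magnitudes, m_min=4.0):
    """Toy forecast: total expected triggered rate from the productivity law."""
    return np.sum(10.0 ** (ALPHA * (magnitudes - m_min)))

def bootstrap_forecasts(observed_magnitudes, n_samples=1000):
    """Ensemble of forecasts obtained by resampling plausible true magnitudes."""
    rates = np.empty(n_samples)
    for k in range(n_samples):
        eps = rng.laplace(scale=NOISE_SCALE, size=observed_magnitudes.size)
        rates[k] = forecast_rate(observed_magnitudes + eps)
    return rates

observed = np.array([4.2, 4.5, 5.1, 4.8, 6.0])   # toy noisy catalog magnitudes
ensemble = bootstrap_forecasts(observed)
print("median forecast rate:", np.median(ensemble))
print("90% forecast interval:", np.percentile(ensemble, [5, 95]))
```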

  21. Earthquake Forecasting Based on Data Assimilation. Outline: • current methods for accounting for uncertainties • introduction to data assimilation • how data assimilation can help • Bayesian data assimilation (DA) • sequential Monte Carlo methods for Bayesian DA • demonstration of use for a noisy renewal process. Werner, Ide & Sornette (2008), in preparation.

  22. Existing Methods in Earthquake Forecasting • The Benchmark: ignore uncertainties; the current "strategy" of operational forecasts (e.g. cluster models) • The Bootstrap: sample from plausible observations to generate an average forecast; applied to renewal processes with noisy occurrence times in paleoseismological studies (Rhoades et al., 1994; Ogata, 2002) • The Static Bayesian: consider the entire data set and correct the observations by the model forecast; applied to renewal processes with noisy occurrence times in paleoseismological studies (Ogata, 1999). Goals: generalize to multi-dimensional, marked point processes; use a Bayesian framework for optimal use of information; provide sequential forecasts and updates.

  23. Data Assimilation • Talagrand (1997): "The purpose of data assimilation is to determine as accurately as possible the state of the atmospheric (or oceanic) flow, using all available information" • The statistical combination of observations and short-range forecasts produces the initial conditions used in the model to forecast (Bayes' theorem). • Advantages: general conceptual framework for uncertainties; constrain unknown initial conditions; account for observational noise, system noise and parameter uncertainties; deal with missing observations; best possible recursive forecast given all information; include different types of data.

  24. Data Assimilation

  25. Bayesian Data Assimilation. Unobserved states x_t, noisy observations y_t. Using Bayes' theorem, the initial condition, the model forecast and the data likelihood are combined to obtain the posterior, sequentially: • Prediction: p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1} • Update: p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1}). • This is a conceptual solution only; an analytical solution is available only under additional assumptions (Kalman filter: Gaussian distributions, linear model). • Approximations: local Gaussian (extended Kalman filter); ensembles of local Gaussians (ensemble Kalman filter); particle filters (non-linear models, arbitrary evolving distributions).

  26. Sequential Monte Carlo Methods • a flexible set of simulation-based techniques for estimating posterior distributions • no applications yet to point-process models (or to seismology) • the posterior is approximated by a set of particles with associated weights
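
A minimal bootstrap particle filter (sequential importance resampling) illustrating "particles" and "weights" on an assumed toy state-space model (a Gaussian random-walk state observed with additive Gaussian noise); this is a generic sketch, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(7)
N_PARTICLES = 1000
PROCESS_STD, OBS_STD = 0.3, 0.5

def particle_filter(observations):
    particles = rng.normal(0.0, 1.0, N_PARTICLES)        # initial ensemble
    estimates = []
    for y in observations:
        # Prediction: propagate each particle through the stochastic model
        particles = particles + rng.normal(0.0, PROCESS_STD, N_PARTICLES)
        # Update: weight each particle by the likelihood of the observation
        weights = np.exp(-0.5 * ((y - particles) / OBS_STD) ** 2)
        weights /= weights.sum()
        # Resample to avoid weight degeneracy (multinomial resampling)
        idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
        particles = particles[idx]
        estimates.append(particles.mean())                # posterior-mean estimate
    return np.array(estimates)

# Toy usage: filter noisy observations of a hidden random walk
truth = np.cumsum(rng.normal(0.0, PROCESS_STD, size=50))
obs = truth + rng.normal(0.0, OBS_STD, size=50)
print(np.round(particle_filter(obs)[-5:], 2))
print(np.round(truth[-5:], 2))
```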

  27. Temporal Renewal Processes [equations: the renewal process, the observational noise, the forecast, the observation likelihood, and the analysis/posterior]. Werner, Ide and Sornette (2007), in prep.
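
A generic sketch of how these pieces fit together for a renewal process with noisy occurrence times (my notation, not the slide's): let t_k denote the true occurrence times, t_k^obs the observed ones, f the renewal density of the inter-event interval and g the density of the observational noise.

```latex
% Forecast: distribution of the next true occurrence time under the renewal model
p(t_k \mid t_{k-1}) = f(t_k - t_{k-1})

% Noise and observation likelihood: observed times are perturbed true times
t_k^{\mathrm{obs}} = t_k + \epsilon_k , \qquad
p(t_k^{\mathrm{obs}} \mid t_k) = g(t_k^{\mathrm{obs}} - t_k)

% Analysis / posterior (Bayes' theorem, up to normalisation):
p(t_k \mid t_k^{\mathrm{obs}}, t_{k-1}) \propto
  g(t_k^{\mathrm{obs}} - t_k)\, f(t_k - t_{k-1})
```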

  28. Numerical Experiment [equations: the model, the noisy observations and the parameter values used]

  29. Step 1

  30. Step 2

  31. Step 5

  32. Outlook • Data assimilation of more complex point processes and operational implementation (non-linear, non-Gaussian DA), including parameter estimation • Estimating and testing (forecasting) the corner magnitude, based on geophysics and extreme value theory (EVT), including uncertainties (Bayesian?) • Spatio-temporal dependencies of seismicity? • Estimating extreme ground shaking • Interest in better spatio-temporal characterization of seismicity (spatial, fractal clustering) • Improved likelihood estimation of parameters in clustering models • (scaling laws in seismicity, critical phenomena and earthquakes)
