
Stefan Wiemer & Danijel Schorlemmer Swiss Seismological Service ETH Zurich






Presentation Transcript


  1. ZMAP – OpenSHA – OpenSAF? Stefan Wiemer & Danijel Schorlemmer, Swiss Seismological Service, ETH Zurich. Major contributions by: Edward (Ned) H. Field (USGS)

  2. Outline • ZMAP – a 10 year old idea/software for seismicity analysis. • OpenSHA: A new concept in Seismic Hazard Assessment. • OpenSAF: Dreaming on …

  3. ZMAP • Developed since 1993 with the intention of providing GUI-based seismicity analysis software; mostly a research tool. • Described in a Seismological Research Letters article in 2001. • Matlab based, open source (about 100,000 lines of code in ~700 scripts). • About 100–150 users worldwide; used in about 50–70 publications.

  4. ZMAP – capabilities • Standard tools: maps, histograms, cross-sections, time series, etc. • Earthquake catalog quality and consistency: magnitude shifts, completeness, blast contamination, etc. Real-time potential. • Rate-change analysis: mapping of rate changes in space-time; significance. • b-value analysis: mapping of b as a function of space and time. • Aftershock sequence analysis; time-dependent hazard assessment. • Stress tensor inversion based on focal mechanism data. • Time-to-failure analysis. • Fractal dimension analysis: mapping of D.
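The b-value analysis listed above typically rests on the maximum-likelihood estimate of Aki (1965) with Utsu's binning correction. A minimal sketch, assuming a completeness cut Mc and 0.1-magnitude binning; ZMAP's own implementation may differ in detail:

```java
public class BValue {
    /**
     * Maximum-likelihood b-value: b = log10(e) / (mean(M) - (Mc - dM/2)).
     * Events below the completeness magnitude mc are excluded; dM is the
     * magnitude binning width (0.1 for most catalogs).
     */
    static double estimateB(double[] mags, double mc, double dM) {
        double sum = 0.0;
        int n = 0;
        for (double m : mags) {
            if (m >= mc) { sum += m; n++; }
        }
        double meanM = sum / n;
        return Math.log10(Math.E) / (meanM - (mc - dM / 2.0));
    }

    public static void main(String[] args) {
        // Illustrative magnitudes, not a real catalog.
        double[] mags = {1.0, 1.2, 1.1, 2.0, 1.5, 1.3, 1.0, 1.8, 1.1, 1.4};
        System.out.printf("b = %.2f%n", estimateB(mags, 1.0, 0.1));  // prints b = 1.11
    }
}
```

In practice the uncertainty of b (e.g., Shi & Bolt's formula) would be mapped alongside the estimate itself.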

  5. [Figure: map of the z-value, with the color scale spanning rate decreases and rate increases]
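The z-value mapped above is a standard-deviate test between the mean seismicity rates of two periods. A minimal sketch of that statistic, assuming binned event counts and a sample-variance formulation; ZMAP's specific window choices (LTA, AS) are not reproduced here:

```java
public class ZValue {
    static double mean(double[] x) {
        double s = 0.0;
        for (double v : x) s += v;
        return s / x.length;
    }

    static double variance(double[] x, double m) {
        double s = 0.0;
        for (double v : x) s += (v - m) * (v - m);
        return s / (x.length - 1);  // sample variance
    }

    /**
     * z = (R1 - R2) / sqrt(s1^2/n1 + s2^2/n2), where r1 and r2 hold the
     * event counts per time bin in the two windows being compared.
     * Positive z indicates a rate decrease in the second window.
     */
    static double zValue(double[] r1, double[] r2) {
        double m1 = mean(r1), m2 = mean(r2);
        return (m1 - m2)
            / Math.sqrt(variance(r1, m1) / r1.length + variance(r2, m2) / r2.length);
    }

    public static void main(String[] args) {
        double[] before = {2, 3, 4, 3, 2, 4};  // events per bin, first window
        double[] after  = {1, 1, 2, 1, 1, 2};  // events per bin, second window
        System.out.printf("z = %.2f%n", zValue(before, after));  // prints z = 3.95
    }
}
```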

  6. b-values along the SAF: Highly spatially heterogeneous

  7. Example: Mc after Landers • The magnitude of completeness in the hours and days after a mainshock is considerably higher. Could this be improved?

  8. Magnitude of Completeness (Mc) [Figure: frequency–magnitude distributions at two sites, A and B, illustrating the spatial variability of Mc] • Completeness is temporally and spatially highly heterogeneous. • A detailed Mc(x,y,z,t) history should be constructed; maintained by the networks?
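One common way to estimate Mc from such frequency–magnitude distributions is the maximum-curvature approach: take the magnitude bin holding the most events in the non-cumulative distribution. A minimal sketch; ZMAP offers several Mc estimators, and in practice a correction of roughly +0.2 is often added to this one (both points are assumptions about usage, not ZMAP's verbatim code):

```java
import java.util.HashMap;
import java.util.Map;

public class MaxCurvatureMc {
    /**
     * Maximum-curvature Mc: the magnitude bin (width 0.1) with the most
     * events in the non-cumulative frequency-magnitude distribution.
     */
    static double estimateMc(double[] mags) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (double m : mags) {
            counts.merge((int) Math.round(m * 10.0), 1, Integer::sum);
        }
        int bestBin = Integer.MIN_VALUE, bestCount = -1;
        for (Map.Entry<Integer, Integer> e : counts.entrySet()) {
            if (e.getValue() > bestCount) {
                bestCount = e.getValue();
                bestBin = e.getKey();
            }
        }
        return bestBin / 10.0;
    }

    public static void main(String[] args) {
        // Illustrative magnitudes with the modal bin at M 1.2.
        double[] mags = {0.8, 1.0, 1.2, 1.2, 1.2, 1.4, 1.6, 2.0};
        System.out.println("Mc (max. curvature) = " + estimateMc(mags));
    }
}
```

Applied in a sliding space-time window, this is the kind of estimator an Mc(x,y,z,t) history would be built from.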

  9. Example: Parkfield magnitude shift? [Figure: frequency–magnitude distributions for 1980–1990 vs. 1995–2000, and cumulative number of 0 < M < 1 events vs. time, 1980–2005] • What happened around 1995 to the catalog of the Parkfield section of the San Andreas fault? • Catalogs should be monitored routinely in the future to detect man-made (and natural) transients early on.

  10. ZMAP – what worked well • Matlab based: Efficient development, expandable, widely available, largely platform independent. • Addresses a definite need in the seismological community. • Nice research tool for those who know how to use it.

  11. ZMAP – limitations • Too complex. Not stable enough. • No systematic user support (lately: very limited support). • No dedicated financial support to develop and maintain the software. • Difficult to embed other codes (wrappers work only partially, e.g., stress tensor inversions). • Does not work in parallel mode.

  12. ZMAP – summary • Has reached the end of its lifecycle? • What would a new generation seismicity analysis software do? • Can we make it GRID based? (Simulations can take days to weeks) • Can we make it object oriented?

  13. Creating a Distributed, Community-Modeling Environment in Support of the Working Group for the Development of Regional Earthquake Likelihood Models (RELM). Edward (Ned) H. Field (USGS) & Thomas H. Jordan (USC)

  14. OpenSHA: A Developing, Distributed Community-Modeling Environment for Seismic Hazard Analysis • Design criteria: open source, web enabled, & object oriented. • Implementation: Java & XML, although the framework is programming-language independent, and some components will be “wrapped” legacy code (e.g., WG99 Fortran code).

  15. Source + Attenuation + Site = Hazard

  16. Seismic Hazard Analysis • (1) Earthquake-Rupture Forecast: probability in time and space of all M ≥ 5 ruptures. • (2) Ground-Motion Model: “attenuation relationships” or full waveform modeling.

  17. OpenSHA • Web site: http://www.OpenSHA.org • SHA framework: SRL submission (Field, Jordan, & Cornell) • Design evaluation: SCEC Implementation Interface • Code development: Ned Field, Sid Hellman, Steve Rock, Nitin Gupta, & Vipin Gupta • Validation: PEER Working-Group test cases

  18. OpenSHA Objects • Time Span: the desired output is the probability that something of concern will happen over a specified time span. [Diagram: an Earthquake-Rupture Forecast generates rupture Sources; each Source has N Earthquake Ruptures (Rup n,i); combined with a Site (type) and an Intensity-Measure Relationship (IM type and level), these yield a probability of occurrence]
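The object relationships in this diagram can be sketched as plain Java interfaces. These names and signatures are hypothetical simplifications for illustration; the actual OpenSHA class definitions are published at www.OpenSHA.org and differ in detail:

```java
import java.util.List;

// Hypothetical, simplified interfaces mirroring the slide's object diagram.
interface EqkRupture {
    double getProbability();            // P(occurrence) within the time span
}

interface EqkSource {
    List<EqkRupture> getRuptures();     // each Source has N Earthquake Ruptures
}

interface EqkRupForecast {
    List<EqkSource> getSources();       // the forecast generates rupture Sources
}

/** A minimal concrete forecast: one source with two identical ruptures. */
class SimpleForecast implements EqkRupForecast {
    @Override public List<EqkSource> getSources() {
        EqkRupture rup = () -> 0.1;     // lambdas work: one-method interfaces
        EqkSource src = () -> List.of(rup, rup);
        return List.of(src);
    }
}

public class ErfDemo {
    /** Walk the forecast exactly as the diagram shows: sources, then ruptures. */
    static int countRuptures(EqkRupForecast erf) {
        int n = 0;
        for (EqkSource src : erf.getSources()) n += src.getRuptures().size();
        return n;
    }

    public static void main(String[] args) {
        System.out.println(countRuptures(new SimpleForecast()));  // prints 2
    }
}
```

The point of the design is that any forecast implementing the interface, however complex internally, is interchangeable from the calculator's point of view.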

  19. OpenSHA Objects • Intensity-Measure Type/Level: a specification of what the analyst (e.g., engineer) is worried about. [Same object diagram as the previous slide]

  20. OpenSHA Objects • Site & Prob. Eqk Rupture: the two main physical objects used in the analysis. [Same object diagram as the previous slide]

  21. OpenSHA Objects • Intensity-Measure Relationship: one of the major model components (a variety available or being developed). [Same object diagram as the previous slide]

  22. OpenSHA Objects • Eqk Rupture Forecast: the other main model component (a variety being developed in RELM). [Same object diagram as the previous slide]

  23. Web-Based Tools for SHA [Diagram: a hazard calculation combines a Time Span, an Earthquake-Rupture Forecast (with its list of adjustable parameters), an Intensity-Measure Relationship (with its supported intensity-measure types and site-related independent parameters), a Site location (with site-related parameters), and an intensity-measure type & level (IMT & IML) to yield Prob(IMT ≥ IML)]
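Under the usual assumption that rupture occurrences are independent, the hazard calculation Prob(IMT ≥ IML) combines each rupture's occurrence probability with the conditional exceedance probability supplied by the intensity-measure relationship. A minimal sketch with illustrative numbers, not a real forecast:

```java
public class HazardCalc {
    /**
     * P(IM >= IML) = 1 - product over ruptures of (1 - P(rup) * P(IM >= IML | rup)),
     * assuming independent rupture occurrences.
     *
     * rupProb[i]:        probability that rupture i occurs in the time span
     * condExceedProb[i]: P(IM >= IML | rupture i), from the IM relationship
     */
    static double probExceed(double[] rupProb, double[] condExceedProb) {
        double pNone = 1.0;  // probability that no rupture causes exceedance
        for (int i = 0; i < rupProb.length; i++) {
            pNone *= 1.0 - rupProb[i] * condExceedProb[i];
        }
        return 1.0 - pNone;
    }

    public static void main(String[] args) {
        double[] rupProb = {0.10, 0.05, 0.02};  // illustrative occurrence probs
        double[] cond    = {0.50, 0.80, 0.95};  // illustrative exceedance probs
        System.out.printf("P = %.3f%n", probExceed(rupProb, cond));  // prints P = 0.105
    }
}
```

Repeating this over a range of IML values yields the familiar hazard curve.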

  24. [Diagram: inputs feeding an Earthquake Forecast: a time span, a source list, the network earthquake catalog, a fault activity database, GPS data (velocity vectors), the historical earthquake catalog, and a community fault model]

  25. OpenSHA We want the various models and community databases to reside at their geographically distributed host institutions, and to be run-time accessible over the internet. This is an absolute requirement for making the community modeling environment both usable and manageable.

  26. OpenSHA Building this distributed, community-modeling environment raises several issues that we don’t presently know how to deal with: • The distributed system must be easy to use, which means hiding details as much as possible. • Analysis results must be reproducible, which means something has to keep track of all those details. • Computations must be fast, as web-based users aren’t going to want to wait an hour for a hazard map or synthetic seismograms. • We’ll need a mechanism for preventing erroneous results due to unwitting users plugging together inappropriate components.

  27. The SCEC ITR collaboration is helping: (a few examples and lots of $$$$$) • Grid computing: to enable run-time access to whatever high-performance computing resources are available at that moment. This will help reduce the time to generate a hazard map, or a synthetic seismogram, from hours to (hopefully) seconds.

  28. The SCEC ITR collaboration is helping: (a few examples) • Knowledge Representation and Reasoning (KR&R): • To keep track of the relationships among components, and to monitor the construction of computational pathways to ensure that compatible elements are plugged together.

  29. The SCEC ITR collaboration is helping: (a few examples) • KR&R and Digital Libraries: • To enable smart eDatabase inquiries • (e.g., so code can construct an appropriate probability model for a fault based on the latest information found in the fault activity database).

  30. The SCEC ITR collaboration is helping: (a few examples) • Digital Libraries: • To enable version tracking for purposes of reproducibility in an environment of continually evolving models and databases.

  31. OpenSHA: A Community-Modeling Environment for Seismic Hazard Analysis (summary) • An infrastructure for developing and testing arbitrarily complex (physics-based, system-level) SHA components, while putting minimal constraints on (or additional work for) the scientists developing the models. • Provides a means for the user community to apply the most advanced models to practical problems (which they cannot presently do).

  32. OpenSHA • More info available at: http://www.OpenSHA.org, including exact object definitions and a library of Java classes that others might find useful.

  33. What can we learn from OpenSHA for ZMAP? Back to good old Europe…

  34. NERIS offered an opportunity • N6, Task B: Building the foundation for a community-based Seismicity Analysis Framework (OpenSAF). The information contained in modern earthquake data sets is currently exploited by seismologists using a variety of independent tools (e.g., SSLib, ZMAP, Wizmap, GMT, Slick, Coulomb 2.2) which have no interoperability or standardization. Better and more efficient exploitation of this information requires an integrated set of modern, interactive, easy-to-use and accessible tools for visualization, quality assessment, data mining, statistical modeling, quantitative hypothesis evaluation and many other tasks. Such integration could be provided by a seismicity analysis framework (OpenSAF): a centralized, Internet-ready platform for accessing visualization and analysis tools. OpenSAF would be designed to interoperate closely with OpenSHA.
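Interoperability of this kind would start with a shared catalog record that every tool reads and writes. A minimal sketch; the field choice is an illustrative assumption, not an OpenSAF specification:

```java
/**
 * Sketch of a minimal shared earthquake-catalog record, the kind of common
 * data structure an OpenSAF-style framework would need so that tools such as
 * ZMAP, SSLib, or Coulomb could exchange catalogs. Field choice is assumed.
 */
public class CatalogEvent {
    public final double decimalYear;   // origin time as a decimal year
    public final double lat, lon;      // epicenter, degrees
    public final double depthKm;
    public final double magnitude;

    public CatalogEvent(double decimalYear, double lat, double lon,
                        double depthKm, double magnitude) {
        this.decimalYear = decimalYear;
        this.lat = lat;
        this.lon = lon;
        this.depthKm = depthKm;
        this.magnitude = magnitude;
    }

    /** True if the event falls in [startYear, endYear) at or above minMag. */
    public boolean inWindow(double startYear, double endYear, double minMag) {
        return decimalYear >= startYear && decimalYear < endYear
                && magnitude >= minMag;
    }
}
```

Every analysis module (rate changes, b-values, Mc histories) could then consume the same record type instead of tool-specific file formats.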

  35. The Future • I learned: I am more objective-oriented than object-oriented. • Developing OpenSAF in Java (or similar) would, in our opinion, be a laudable objective; however, it would require a sustained effort and significant financial support. Is it worth it in this case? Or should we stick to a high-level language? • Where could the support come from? How can one make it a community-supported, sustainable effort?

  36. The Future • The alternative might be a new, modular, Matlab-based research program that avoids the mistakes of the old ZMAP, with the ability to build stand-alone, streamlined modules for specific tasks (monitoring of completeness, rate changes, artifacts …). A ‘license fee’ from users that raises about one man-year of support might be feasible. The End
