
A community modeling environment: geodynamic integration of multi-scale geoscience data






Presentation Transcript


  1. A community modeling environment: geodynamic integration of multi-scale geoscience data. Mian Liu(1), Huai Zhang(1,2), Youqing Yang(1), Qingsong Li(1), Yaolin Shi(2). 1: University of Missouri-Columbia; 2: Computational Geodynamic Lab, CAS, China. www.geongrid.org

  2. Motivation 1: Exponential increase in multi-scale observational data that must be integrated and interpreted within a self-consistent geodynamic framework

  3. EarthScope Instruments

  4. EarthScope Annual Data Volume. Projected data volumes over the next 10 years: • GPS: ~7.7 TB • BSM/LSM: ~10.5 TB • Seismic: ~120 TB

  5. Multi-timescales of geoscience data

  6. Motivation 2: Advances in computer hardware (especially PC clusters and grid computers) and software engineering have provided unprecedented computing power; data infrastructure has made integrating multi-scale data both easy and necessary.

  7. So we built the data cyberinfrastructures; now what? (Diagram: physical model connected to HPCC, the Internet, and the GEON Data Grid)

  8. Free scientists from coding to do science, or whatever they do best. (Diagram: physical model connected to HPCC, the Internet, and the Data Grid)

  9. Some current efforts in geodynamic computation: the Earth Simulator GeoFEM project; GeoFramework; QuakeSim; the SCEC Community Modeling Environment; CIG (Computational Infrastructure for Geodynamics)

  10. More than one way to do it: develop a specific type of model (e.g., mantle convection), or use plug-in modules in a general system to generate specific types of models (wave, fluid, structure, etc.)

  11. Example: the Earth Simulator GeoFEM project (http://geofem.tokyo.rist.or.jp), a multi-purpose/multi-physics parallel FE simulator/platform for solid Earth. Different finite element models can be plugged into this system.

  12. Not all geological needs fit into these pigeonholes; the models need integration with the data cyberinfrastructure and must scale to parallel and grid computation. Wouldn't it be nice to have a general, flexible community modeling system? Wouldn't it be nice if all (or most) of this could be automated?

  13. Examples of commercial FE code generation systems: PDE2D (http://members.aol.com/pde2d); FEPG (Finite Element Program Generator) (http://www.fegensoft.com/english/index.htm)

  14. The devil is in the details… System components:
  • Geometric modeling of real applications
  • Numerical libraries: BLAS, BLACS, PBLAS, splib, fepglib, AZsolv, MUMPS, ARPACK, SuperLU
  • Sequential solvers; Krylov-subspace iterative solvers and preconditioners
  • FEM mesh generator
  • Graph and data partitioning (Metis 4.0, PMetis 3.0)
  • User-level communication protocol over MPI
  • Data structures and optimization
  • Common Gateway Interface for users via web browsers
  • PFEPG server administration tools; server-client communication system
  • PDE2FEM system
  • Dynamic load balancing across the nodes of the parallel computer
  • LMDDM and LMDDA algorithm kernel subroutines
  • Element subroutine generators (GES, PDE, CDE, SDE, etc.)
  • Nonlinear algorithm generators (GCN, NFE, etc.)
  • Libraries for PDEs, shape functions, and other software packages
  • Theoretical and application documents for users

  15. Automated Code Generator, Step 1: from PDE expression to Fortran segments.
  PDE expression (contains the information of the physical model, such as the variables and equations, for generating the element stiffness matrix):
    disp u v
    coor x y
    func funa funb func
    shap %1 %2
    gaus %3
    mass %1
    load = fu fv
    $c6 pe = prmt(1)
    $c6 pv = prmt(2)
    $c6 fu = prmt(3)
    $c6 fv = prmt(4)
    $c6 fact = pe/(1.+pv)/(1.-2.*pv)
    func
    funa=+[u/x]
    funb=+[v/y]
    func=+[u/y]+[v/x]
    stif
    dist = +[funa;funa]*fact*(1.-pv)
          +[funa;funb]*fact*(pv)
          +[funb;funa]*fact*(pv)
          +[funb;funb]*fact*(1.-pv)
          +[func;func]*fact*(0.5-pv)
  Generated Fortran segments (code that realizes the physical model at the element level):
    Segment 1:
      *es,em,ef,Estifn,Estifv, variables
    Segment 2:
      *es(k,k),em(k),ef(k),Estifn(k,k),Estifv(kk),
       goto (1,2), ityp
     1 call seuq4g2(r,coef,prmt,es,em,ec,ef,ne)
       goto 3
     2 call seugl2g2(r,coef,prmt,es,em,ec,ef,ne)
       goto 3
     3 continue
    Segment 3:
      DO J=1,NMATE
        PRMT(J) = EMATE((IMATE-1)*NMATE+J)
      END DO
      PRMT(NMATE+1)=TIME
      PRMT(NMATE+2)=DT
      prmt(nmate+3)=imate
      prmt(nmate+4)=num
    Segment 4: other element-matrix computing subroutines
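The dist expression above is the weak form of plane-strain linear elasticity: funa = ∂u/∂x, funb = ∂v/∂y, func = ∂u/∂y + ∂v/∂x, with fact = E/((1+ν)(1−2ν)). As a sanity check, a minimal Python sketch of the corresponding elasticity matrix (the material values are illustrative, not from the slides):

```python
def plane_strain_D(E, nu):
    """Plane-strain elasticity matrix in Voigt order (exx, eyy, gxy).
    Its three distinct coefficients match the PDE expression above:
    fact*(1-pv), fact*pv, and fact*(0.5-pv)."""
    fact = E / ((1.0 + nu) * (1.0 - 2.0 * nu))
    return [
        [fact * (1.0 - nu), fact * nu,         0.0],
        [fact * nu,         fact * (1.0 - nu), 0.0],
        [0.0,               0.0,               fact * (0.5 - nu)],
    ]

# Illustrative crustal values: E = 30 GPa, nu = 0.25.
D = plane_strain_D(30e9, 0.25)
# The (3,3) entry reduces to the shear modulus E / (2*(1+nu)).
```

This is only a consistency check on the coefficients the generator emits, not part of the FEPG toolchain itself.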

  16. Step 2: from algorithm expression to Fortran segments.
  Algorithm expression (contains the information for forming the global stiffness matrix of the model):
    defi
    stif S
    mass M
    load F
    type e
    mdty l
    step 0
    equation
    matrix = [S]
    FORC=[F]
    SOLUTION U
    write(s,unod) U
    end
  Generated Fortran segments (code that realizes the physical model at the global level):
    Segment 5 (stiffness matrix):
      do i=1,k
        do j=1,k
          estifn(i,j)=0.0
        end do
      end do
      do i=1,k
        estifn(i,i)=estifn(i,i)
        do j=1,k
          estifn(i,j)=estifn(i,j)+es(i,j)
        end do
      end do
    Segment 6:
      U(IDGF,NODI)=U(IDGF,NODI)
     *+ef(i)
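Segment 5 above scatter-adds each element matrix es into the global matrix estifn. A minimal Python sketch of the same element-to-global assembly, with a toy 1D mesh invented for illustration:

```python
def assemble_global(n_nodes, elements, element_matrix):
    """Accumulate element stiffness matrices into a dense global matrix,
    mirroring the estifn(i,j) = estifn(i,j) + es(i,j) loop in Segment 5."""
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for conn in elements:          # conn: global node numbers of one element
        es = element_matrix(conn)  # local stiffness, len(conn) x len(conn)
        for a, row_node in enumerate(conn):
            for b, col_node in enumerate(conn):
                K[row_node][col_node] += es[a][b]
    return K

# Toy example: a 1D bar of two 2-node elements with unit element stiffness.
bar_element = lambda conn: [[1.0, -1.0], [-1.0, 1.0]]
K = assemble_global(3, [(0, 1), (1, 2)], bar_element)
# The shared middle node accumulates contributions from both elements.
```

In the real generated code the loops run over degrees of freedom rather than a scalar per node, but the accumulation pattern is the same.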

  17. Step 3: plug the Fortran segments into a program stencil, forming the final FE program.
  Program stencil (the generated Segments 1-6 are substituted at the corresponding #*.sub markers):
      SUBROUTINE ETSUB(KNODE,KDGOF,IT,KCOOR,KELEM,K,KK,
     *NUMEL,ITYP,NCOOR,NUM,TIME,DT,NODVAR,COOR,NODE,
     *U)
    #SUBET.sub
      implicit double precision (a-h,o-z)
      DIMENSION NODVAR(KDGOF,KNODE),COOR(KCOOR,KNODE),
     *U(KDGOF,KNODE),EMATE(300),
     *R(500),PRMT(500),COEF(500),LM(500)
    #SUBDIM.sub
    #SUBFORT.sub
    #ELEM.sub
    C WRITE(*,*) 'ES EM EF ='
    C WRITE(*,18) (EF(I),I=1,K)
    #MATRIX.sub
      L=0
      M=0
      I=0
      DO 700 INOD=1,NNE
      ………
      U(IDGF,NODI)=U(IDGF,NODI)
    #LVL.sub
      DO 500 JNOD=1,NNE
      ………
  500 CONTINUE
  700 CONTINUE
      ………
      return
      end
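Step 3 is text templating: each #NAME.sub marker in the stencil is replaced by the corresponding generated segment. A minimal Python sketch of that substitution (the marker names follow the stencil above; the segment bodies here are placeholders):

```python
import re

def fill_stencil(stencil, segments):
    """Replace each '#NAME.sub' marker with its generated code segment.
    Unknown markers are left in place so missing segments stay visible."""
    return re.sub(
        r"#(\w+)\.sub",
        lambda m: segments.get(m.group(1), m.group(0)),
        stencil,
    )

stencil = "SUBROUTINE ETSUB(...)\n#SUBDIM.sub\n#MATRIX.sub\nreturn\nend\n"
code = fill_stencil(stencil, {"SUBDIM": "      DIMENSION R(500)"})
# '#SUBDIM.sub' is expanded; '#MATRIX.sub' survives since no segment was given.
```

Leaving unknown markers intact is a deliberate choice here: a half-generated program fails loudly at compile time instead of silently dropping code.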

  18. Examples: western US tectonics; deformation of the Asian continent; stress evolution and strain localization in the San Andreas Fault

  19. A Preliminary Finite Element Model of Active Crustal Deformation in the Western US

  20. (figure)

  21. The power of the GEON cluster (40× vertical topographic exaggeration)
  • Original serial model (single CPU): fewer than 3,000 elements; three layers in the R direction; 2 min per time step
  • Preliminary parallel model (16 nodes, 32 CPUs): more than 800,000 unstructured elements; major faults and more deformation zones; subduction of the Juan de Fuca slab; 21 layers in the R direction

  22. Automatic domain decomposition for parallel computing
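The model's domain decomposition uses Metis graph partitioning (slide 14). As a hedged stand-in, here is a Python sketch of a much simpler scheme, recursive coordinate bisection, which likewise splits a mesh's elements into balanced parts for parallel computing (the element centroids are invented for illustration):

```python
def coordinate_bisection(centroids, n_parts):
    """Split element indices into n_parts balanced groups by repeatedly
    halving the largest group along its longest coordinate axis.
    A simple stand-in for the Metis partitioning used in the real model."""
    parts = [list(range(len(centroids)))]
    while len(parts) < n_parts:
        parts.sort(key=len, reverse=True)
        big = parts.pop(0)                      # largest current part
        dims = range(len(centroids[big[0]]))
        axis = max(dims, key=lambda d: max(centroids[i][d] for i in big)
                                       - min(centroids[i][d] for i in big))
        big.sort(key=lambda i: centroids[i][axis])
        half = len(big) // 2
        parts += [big[:half], big[half:]]
    return parts

# Toy 4x4 grid of element centroids, split among 4 "CPUs".
grid = [(float(x), float(y)) for x in range(4) for y in range(4)]
parts = coordinate_bisection(grid, 4)
```

Unlike Metis, this ignores the element adjacency graph, so it balances load but does not minimize communication between parts.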

  23. (figure)

  24. The model now allows simulation of large-scale continental deformation with unprecedented detail

  25. The model now allows simulation of large-scale continental deformation with unprecedented detail

  26. Ongoing effort: toward a new 3D model of continental deformation in Asia

  27. Predicted surface shear stress

  28. Loading the San Andreas Fault by relative Pacific-North America (PA-NA) motion: • fully 3D • dynamic • plastic-viscoelastic • co-seismic/interseismic cycles • time scales from seconds to 10^4 years • parallel computing on PC clusters
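The range "from seconds to 10^4 years" spans elastic (co-seismic) and viscous (interseismic) behavior. For a Maxwell viscoelastic material the crossover between the two is the relaxation time τ = η/μ; a quick Python check with illustrative lower-crustal values (not taken from the slides):

```python
def maxwell_relaxation_time_years(eta, mu):
    """Maxwell relaxation time tau = eta / mu (viscosity over shear
    modulus), converted from seconds to years."""
    seconds_per_year = 365.25 * 24.0 * 3600.0
    return eta / mu / seconds_per_year

# Illustrative values: eta = 1e19 Pa*s, mu = 30 GPa -> tau on the order of
# a decade, well inside the seconds-to-10^4-year simulation window.
tau = maxwell_relaxation_time_years(1e19, 30e9)
```

Because τ sits many orders of magnitude above the co-seismic time scale, simulating earthquake cycles requires time stepping that resolves both regimes, which is what motivates the parallel computation above.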

  29. Comparison of predicted surface velocity and GPS data

  30. Predicted maximum shear stress

  31. (figure)

  32. Predicted rate of plastic strain energy release outside the SAF

  33. Dream on… Integrating the community modeling environment with the geoscience data cyberinfrastructure; grid computation and data integration; automated (optimized?) workflow management (the Kepler system?)

  34. The envisioned pipeline: physical model → PDEs → FEM modeling language → automatic source code generator → complete source code → HPCC → model results, with the Data Grid (GEON and others) supplying data and scientific workflows (SWF) connecting the stages.
  FEM modeling language example (3D elasticity):
    func
    funa=+[u/x]
    ………
    funf=+[u/y]+[v/x]
    ………
    dist =+[funa;funa]*d(1,1)+[funa;funb]*d(1,2)+[funa;func]*d(1,3)
         +[funb;funa]*d(2,1)+[funb;funb]*d(2,2)+[funb;func]*d(2,3)
         +[func;funa]*d(3,1)+[func;funb]*d(3,2)+[func;func]*d(3,3)
         +[fund;fund]*d(4,4)+[fune;fune]*d(5,5)+[funf;funf]*d(6,6)
    load = +[u]*fu+[v]*fv+[w]*fw-[funa]*f(1)-[funb]*f(2)-[func]*f(3)
         -[fund]*f(4)-[fune]*f(5)-[funf]*f(6)

  35. Thank you!
