
  1. E-Infrastructure shared between Europe and Latin America. GRID distributed computation of nested climate simulations. The project: www.eu-EELA.org. EELA is a project funded by the European Union under contract 026409. A. S. Cofiño (1), M. Carrillo (2), C. Baeza (3), J. Fernández (1), R. M. San-Martín (2), R. Abarca (3) and J. M. Gutiérrez (1), on behalf of the EELA team. (1) Dept. of Applied Mathematics and Computing Sciences, University of Cantabria, Spain; (2) Servicio Nacional de Meteorología e Hidrología, Peru; (3) Universidad de Concepción, Chile

  2. GRID Computing Applications draw computing power from a Computational Grid in the same way electrical devices draw power from an electrical grid

  3. GRID Computing • Developed in the mid-90s • Uses distributed, heterogeneous, dynamic and, usually, parallel computational resources • Middleware and standard software to build applications (Globus Toolkit, OGSA, …) • Several research projects (and commercial products) are developing this technology. Applications draw computing power from a Computational Grid in the same way electrical devices draw power from an electrical grid

  4. EELA Goals E-infrastructure shared between Europe and Latin America • Goal: To build a bridge between consolidated e-Infrastructure initiatives in Europe and emerging ones in Latin America. • Objectives: • Establish a human collaboration network between Europe and Latin America • Set up a pilot e-infrastructure in Latin America • Identify and promote a sustainable framework for e-Science in Latin America

  5. EELA structure EELA is structured in four Work Packages: • WP1. Project administrative and technical management • WP2. Pilot testbed operation and support • WP3. Identification and support of Grid-enhanced applications • Task 3.1. Biomed Applications • Task 3.2. High Energy Physics Applications • Task 3.3. Additional Applications: • E-Learning • Climate • WP4. Dissemination activities

  6. Partners. EU: Spain (CIEMAT, CSIC, UPV, RED.ES, UC), Italy (INFN), Portugal (LIP). Latin America: Venezuela (ULA), Cuba (CUBAENERGIA), Chile (UTFSM, REUNA, UDEC), Peru (SENAMHI), Mexico (UNAM), Argentina (UNLP), Brazil (UFRJ, CNEN, CECIERJ/CEDERJ, RNP, UFF). International: CLARA, CERN.

  7. Available resources

  8. WP 3.3. Climate Task. GOAL: Predict local impacts of “El Niño” in Latin America. A challenging problem for the climate community, with huge socio-economic impact in Latin America. GRID helps to share computing resources, heterogeneous data, as well as know-how, in a user-friendly form. A new integrated climate application developed in EELA from scratch, with no similar counterpart in any other Earth Science/Climate EU Project. [Figure: map of SST anomalies showing regions of anomalous heating and anomalous cooling]

  9. CAM & WRF The Community Atmosphere Model (CAM) and the Weather Research and Forecasting (WRF) models are state-of-the-art atmosphere (global and regional) models developed at NCAR. Output format: NetCDF. CAM: FORTRAN90 system() calls were introduced into the model so that it moves data and status information to the Grid catalogs (LFC and AMGA). WRF: the model does not yet interact with the middleware; although it runs in the Grid testbed, it is not yet coupled to CAM.
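As a rough illustration of what those system() calls delegate to, here is a minimal Python sketch of the equivalent shell commands. lcg-cr is the standard gLite data-management tool, but the AMGA client invocation, catalog paths and attribute names are assumptions, not the EELA schema:

    import subprocess

    def publish_history(local_file, lfn, run_id):
        # Upload the NetCDF history file to a Storage Element and register it in the LFC
        subprocess.run(["lcg-cr", "--vo", "eela",
                        "-l", "lfn:" + lfn, "file:" + local_file], check=True)
        # Record the new file in the (assumed) AMGA history collection for this run
        subprocess.run(["mdcli", "addentry /eela/cam/history/" + run_id + " file " + lfn],
                       check=True)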

  10. Computational challenge A challenging computational problem with nontrivial dependency relationships among the applications. A cascade of dynamic dependent jobs is adopted (see the sketch below). The cascade of applications interacts with the middleware to: • Prepare and submit dependent jobs • Store and retrieve the generated data sets (data sharing) • Manage metadata (for the data sets and application status) • Restart broken experiments. [Diagram: SST + other forcings feed one CAM global run, which spawns n parallel WRF jobs (par 1 … par n), each followed by a SOM analysis; every stage reads and writes Storage Elements (SE)]
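A minimal sketch of the cascade logic follows; job names and granularity are illustrative, and the real experiments submit Grid jobs rather than call a local function:

    # Each job lists the jobs it depends on: WRF only becomes eligible
    # once CAM is done, and each SOM waits for its WRF run.
    jobs = {
        "cam":   [],
        "wrf_1": ["cam"], "wrf_2": ["cam"],
        "som_1": ["wrf_1"], "som_2": ["wrf_2"],
    }

    def run(job):
        # Placeholder: generate the job's JDL, submit it to the Resource
        # Broker and wait until AMGA reports completion.
        print("running", job)

    done = set()
    while len(done) < len(jobs):
        for job, deps in jobs.items():
            if job not in done and all(d in done for d in deps):
                run(job)
                done.add(job)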

  11. CAM Application Workflow. (1) From the UI, upload the input info, generate CAM.jdl and insert the run in the CAM collection in AMGA. (2) The submitter sends CAM.jdl to the Resource Broker, which schedules it on a Computing Element. (3) As CAM runs, it updates its status in AMGA, inserts restart checkpoints in the CHECKPOINT collection, updates its history in the HISTORYCAM collection, and inserts entries in the WRF collection for the regional jobs.
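Steps (1)-(2) might look like the following sketch on the UI. The JDL fields and the wrapper-script name are assumptions, and the exact submission command (glite-wms-job-submit here) depends on the middleware release:

    import subprocess

    # Hypothetical CAM job description; cam_wrapper.sh would run the model
    # and issue the catalog commands shown earlier.
    CAM_JDL = """[
      Executable    = "cam_wrapper.sh";
      Arguments     = "nino82d";
      InputSandbox  = {"cam_wrapper.sh", "camexp.nl"};
      OutputSandbox = {"cam.log", "cam.err"};
    ]"""

    with open("CAM.jdl", "w") as f:
        f.write(CAM_JDL)
    subprocess.run(["glite-wms-job-submit", "-a", "CAM.jdl"], check=True)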

  12. Info & Data Flow. [Diagram: CAM and WRF write DATA to Storage Elements (SE) registered in the LFC file catalog, and write metadata and status information to the AMGA metadata catalog]

  13. Info & Data Flow. [Diagram: as above, with the user driving CAM and WRF through a Portal that queries AMGA] If a submitted job is not running, the user queries AMGA to check whether it finished successfully. If not, the user looks up the last restart file and restarts the CAM job. While a CAM job is running, the user queries AMGA for the data sets produced by CAM and then triggers the WRF jobs, and so on with the SOMs (a polling sketch follows).
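Expressed as code, the manual bookkeeping above amounts to a polling loop like this sketch; the AMGA client invocation, collection paths and attribute names are assumptions:

    import subprocess, time

    def query_amga(command):
        # Hypothetical wrapper around the AMGA command-line client
        out = subprocess.run(["mdcli", command],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip()

    while True:
        status = query_amga("getattr /eela/cam/runs/nino82d status")
        if status == "done":
            print("trigger the WRF jobs on the new CAM data sets")
            break
        if status == "failed":
            ckpt = query_amga("getattr /eela/cam/runs/nino82d last_checkpoint")
            print("restart CAM from", ckpt)
            break
        time.sleep(300)   # job queued or still running; check again later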

  14. Info & Data Flow (future). [Diagram: same flow, plus a coordinator between the Portal and R-GMA, the Relational Grid Monitoring Architecture] A coordinator application (unattended and event-driven) should take care of successful job completion. To be implemented!
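Purely as an illustration of the proposal (nothing like this exists yet in EELA), the coordinator could be an unattended loop reacting to job-state events. Here the events are faked as a list, whereas the proposal would take them from R-GMA:

    # Hypothetical event-driven coordinator sketch.
    events = [("cam", "done"), ("wrf_1", "aborted")]

    for job, state in events:
        if job == "cam" and state == "done":
            print("CAM finished: trigger the dependent WRF jobs")
        elif state == "aborted":
            print("resubmit", job, "from its last restart checkpoint")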

  15. Current status Using GENIUS to interact with the applications (CAM+WRF). In the future, a climate-specific portal (JSR 168) will be developed to run and track scientific experiments.

  16. The CAM namelist needs to be prepared and provided here by the user (not very user-friendly yet); an example follows.

  17. &camexp
      absems_data = 'lfn:/grid/eela/.../abs_ems_factors.nc'
      aeroptics   = 'lfn:/grid/eela/.../AerosolOptics.nc'
      bnd_topo    = 'lfn:/grid/eela/.../topo-from-cami.nc'
      bndtvaer    = 'lfn:/grid/eela/.../AerosolMass.nc'
      bndtvo      = 'lfn:/grid/eela/.../pcmdio3.nc'
      bndtvs      = 'lfn:/grid/eela/.../sst_HadOIBl.nc'
      caseid      = 'nino82d'
      iyear_ad    = 1982
      start_ymd   = 19820101
      ncdata      = 'lfn:/grid/eela/.../cami.nc'
      mfilt       = 1,4,1
      ...
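One way to make this step friendlier (a sketch under assumptions; neither the template fields nor the file name camexp.nl come from the EELA portal) is to fill a namelist template so the user supplies only the experiment-specific values while the lfn: catalog paths stay fixed:

    # Hypothetical namelist generator; the elided lfn: paths are kept as
    # in the slide. Only caseid and the dates vary per experiment.
    TEMPLATE = """&camexp
     bndtvs    = 'lfn:/grid/eela/.../sst_HadOIBl.nc'
     ncdata    = 'lfn:/grid/eela/.../cami.nc'
     caseid    = '{caseid}'
     iyear_ad  = {year}
     start_ymd = {start_ymd}
    /"""

    with open("camexp.nl", "w") as f:
        f.write(TEMPLATE.format(caseid="nino82d", year=1982, start_ymd=19820101))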

  18. Currently, the regions are prepared off-line and uploaded to the LFC.

  19. All CAM simulations are available for regionalization with WRF, but the CAM -> WRF coupler is NOT yet implemented

  20. However, we have other input data sets for WRF available in the catalog.

  21. Finished simulations provide access to their results

  22. The output file can be downloaded in NetCDF format or accessed via a THREDDS- or OpenDAP-aware application.

  23. For instance, the ToolsUI Java application from Unidata can load the OpenDAP (DODS) address and access only the requested portions of the data
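The same selective access works from a script. As a quick sketch using the netCDF4 Python library (the THREDDS URL is a placeholder, and the variable name TS follows CAM output conventions), only the requested slice travels over the network:

    # Open a remote dataset through OpenDAP; no full-file download happens.
    from netCDF4 import Dataset

    ds = Dataset("http://thredds.example.org/thredds/dodsC/eela/cam/nino82d.nc")
    ts = ds.variables["TS"][0, :, :]   # fetch one time step of surface temperature
    print(ts.shape)
    ds.close()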

  24. NOAA provides user-friendly access to set up regional domains for WRF

  25. This Java Web Start application could, in the future, be launched from our web portal as a starting point to design a regional simulation

  26. Conclusions • The EU-funded EELA project aims at establishing an e-infrastructure and scientific collaboration between European and Latin American countries. • Within the climate task, we have implemented a sequence of climate applications, CAM+WRF(+SOM), which runs integrated on a common Grid, providing regional simulations for a given SST and other forcings. • The applications interact with the middleware by uploading data and status information to the Grid catalogs (LFC and AMGA), which can easily be accessed through a web portal. • There are still computing issues to be solved (CAM-WRF coupler, simulation restarts, coordinator application, more user-friendly model setup) before entering scientific production.

  27. References CAM Application: http://www.ccsm.ucar.edu/models/atm-cam/ McCaa, J. R., M. Rothstein, B. E. Eaton, J. M. Rosinski, E. Kluzek and M. Vertenstein (2004): User's Guide to the NCAR Community Atmosphere Model (CAM 3.0). Climate and Global Dynamics Division, NCAR, Boulder, Colorado, 88 pp. Collins, W. D., C. M. Bitz, et al. (2006): “The Community Climate System Model: CCSM3”, Journal of Climate, Special Issue on CCSM, 19(11). WRF Application: http://www.wrf-model.org/ Skamarock, W. C., J. B. Klemp, J. Dudhia, D. O. Gill, D. M. Barker, W. Wang and J. G. Powers (2005): A Description of the Advanced Research WRF Version 2. NCAR Technical Note, 88 pp. Michalakes, J., J. Dudhia, D. Gill, T. Henderson, J. Klemp, W. Skamarock and W. Wang (2004): “The Weather Research and Forecast Model: Software Architecture and Performance”, Proceedings of the 11th ECMWF Workshop on the Use of High Performance Computing in Meteorology, 25-29 October 2004, Reading, UK. Ed. George Mozdzynski. SOM Application (Grid version): http://www.meteo.unican.es Luengo, F., A. S. Cofiño and J. M. Gutiérrez (2004): “GRID Oriented Implementation of Self-Organizing Maps for Data Mining in Meteorology”, Lecture Notes in Computer Science, 2970, 163-171.

  28. Thanks for your attention!

  29. Survival guide for reading Grid documents (acronyms):
      AMGA: ARDA Metadata Grid Application
      ARDA: A Realisation of Distributed Analysis for LHC
      CE: Computing Element
      CLARA: Cooperación Latino-Americana de Redes Avanzadas
      EGEE: Enabling Grids for E-sciencE
      GGF: Global Grid Forum
      GIIS: Grid Index Information Service
      GILDA: Grid INFN Laboratory for Dissemination Activities
      GMA: Grid Monitoring Architecture
      GRAM: Grid Resource Allocation Manager
      GRIS: Grid Resource Information Service
      GSI: Grid Security Infrastructure
      INFN: Istituto Nazionale di Fisica Nucleare
      JDL: Job Description Language
      LCG: LHC Computing Grid
      LDAP: Lightweight Directory Access Protocol
      LFC: LCG File Catalog
      LHC: Large Hadron Collider
      MDS: Monitoring & Discovery System
      NREN: National Research and Education Network
      OGSA: Open Grid Services Architecture
      PKI: Public Key Infrastructure
      RB: Resource Broker
      R-GMA: Relational GMA
      SE: Storage Element
      VOMS: Virtual Organization Membership Service
      WS: Web Service

  30. The Earth System Grid. [Architecture diagram: sites at LBNL, ANL, NCAR, LLNL, ORNL and ISI linked by gridFTP (striped servers and clients), with HPSS/MSS mass storage, SRM (Storage Resource Management), RLS (Replica Location Services), MCS (Metadata Cataloguing Services), CAS (Community Authorization Services), MyProxy, OpenDAP-g servers, a LAS (Live Access Server) and a Tomcat servlet-engine front end]

  31. WP 3.3. Climate Demo… CAM (global simulation): requires the metadata catalog, so it runs at UC and CIEMAT (EU) and UFRJ (LA). WRF (regional simulation): runs at any site of the production testbed, e.g. UFRJ.

  32. Scientific Challenge… The second year will focus on a scientific challenge: high-resolution regional simulations of the strong El Niño and La Niña events of 1982-83 and 1997-98, compared with historical local data and including sensitivity studies of the SST. [Diagram: SST forcing drives CAM, which feeds WRF jobs 1…n, each followed by a SOM analysis, with output on Storage Elements (SE)]

