
Presentation Transcript


  1. APAN for Meteorological Studies. August 27, 2003. Jai-Ho Oh, Pukyong National University, Busan, Korea

  2. Main Goals: Establishment of uMeteo-K, a ubiquitous Korean meteorological research cooperation system

  3. Examples of Weather Information: Meteorological Disasters, Industrial Areas, Water Resources, Environment, Wildfire, Health…

  4. [Diagram: the uMeteo-K GRID Testbed as a hub linking component grids and application areas. Grids: University Grid, Inter-Office Grid, Project Grid, Institute Grid, KMA Grid, Private Grid, NGIS Grid, and the APCN-Grid (network hub for RCC). Applications: Core AgMet Station (KoFlux/RS reference, CAgM Grid); risk management (national response); global environment (flux, aerosols, GHG); super-ensemble seasonal/climate prediction; applied meteorology (agro, energy, fishery); model development (next generation / K-model); meteorological industry (met info/instruments); impact assessment (environmental impact/feedback); public service (user requirements/detailed information)]

  5. About uMeteo-K: the concept of a virtual laboratory for interdisciplinary meteorological research
  • Parallelized numerical weather prediction modeling (Computational Grid)
  • Cooperative meteorological research system (Access Grid)
  • Virtual server for large meteorological data (Data Grid)
  Grid technology is essential to accomplishing all three.

  6. Climate Prediction

  7. [Diagram of the coupled climate system: changes in the atmosphere (composition, chemical reaction rates, and circulation), changes in solar input, and changes in the hydrological cycle. Atmospheric constituents: H2O, N2, O2, CO2, O3, aerosols, smoke. Coupled exchanges: terrestrial radiation, precipitation and evaporation, heat exchange, wind stress, snow, air-ice coupling, air-biomass coupling (e.g. N, C), land-biomass coupling (e.g. carbon), rivers and lakes, sea ice, shelf processes, mixed-layer processes, and the deep ocean. Changes at the land surface: continental distribution, vegetation, land use, ecosystems. Human influences include changes in farming practice]

  8. Required Climate Information by 2010: horizontal resolution of climate information improving from 27 km (2003) to 9 km and ultimately 3 km (2010)

  9. Earth Simulator: a massively parallel supercomputer based on the NEC SX-5 architecture. 640 computational nodes with 8 vector processors per node. Peak performance per CPU: 8 GFLOPS. Total peak performance: 8 GFLOPS x 8 CPUs x 640 nodes = 40 TFLOPS.

  10. Development of a high-resolution atmospheric global model on the Earth Simulator for climate study: 10 km or less in the horizontal, 100 levels in the vertical; the Nonhydrostatic ICosahedral Atmospheric Model (NICAM)

  11. Integration of Human and Computational Resources
  • Brain pool
  • Access Grid system
  • Supercomputers
  • Experimental facilities
  • Computers
  • Databases
  • Mass storage
  • Visualization environment
  • High-speed networks
  [Map of participating sites across Korea: Seoul, Incheon, Suwon, Chuncheon, Chonan, Chongju, Daejeon, Jeonju, Daegu, Pohang, Ulsan, Changwon, Kwangju, Pusan, Cheju]

  12. AG in uMeteo-K: setup of the uMeteo-K AG on a PIG + Room Node basis (ICSYM/PKNU, CES/SNU, NCAM/KMA)
  • Linkage within uMeteo-K over the KOREN network
  • Establishment of a duplex video conference system with PIG and Polycom
  • Establishment of a computing environment among uMeteo-K's PIGs (AG Toolkit version 1.2)
  • Establishment of a PIG-based independent Room Node system (NCAM/KMA)

  13. [Diagram: uMeteo-K AG configuration. Participating nodes: ANL, KMA, SNU, KISTI, KAIST, PKNU (Pukyong National University), KJIST, and CNU. Multicast-capable sites connect directly, while unicast-only sites are joined through a Quick Bridge AG server]

  14. Samples of uMeteo-K AG operation < Korea AG group Quick Bridge server test. Participants: PKNU, SNU, KISTI, KJIST, CNU, KAIST, KMA; July 8, 2003 >

  15. < uMeteo-K monthly meeting using VRVS: PKNU (Busan), SNU (Seoul), KMA (Seoul), USA (Washington, D.C.); June 3, 2003 >

  16. uMeteo-K CG Testbed: the uMeteo-K computational grid testbed (two clusters utilized, each with 4 nodes) [Table: a node's specification]

  17. [Diagram: uMeteo-K CG testbed configuration. Two 4-node (single-CPU) clusters, each with a NAS storage server, linked through a 10/100 Ethernet switch hub to KOREN, with a UPS, an electrometer, and a monitoring system]

  18. uMeteo-K CG Testbed S/W
  • Linux: Paran 7.0 (kernel version 2.4.18)
  • Globus 2.4
  • PG Fortran 3.2 (Portland Group)
  • MPICH-G2 1.2.5 for parallel job running, built with PG Fortran
  • NCAR Graphics for graphic display
  • NIS, NFS
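  As a minimal sketch of how this stack fits together (an illustration, not from the slides): MPICH-G2 installs compiler wrappers that invoke the native compiler, here PG Fortran, underneath. The test program name is hypothetical.

    # Hedged sketch: building and smoke-testing an MPI Fortran code with
    # the MPICH-G2 wrappers, which call PG Fortran (pgf90) underneath.
    # "hello_mpi.f90" is a hypothetical test program, not from the slides.
    mpif90 -o hello_mpi hello_mpi.f90   # MPICH-G2 wrapper around pgf90
    mpirun -np 4 ./hello_mpi            # local 4-process sanity check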

  19. Globus linkage between testbed clusters
  [Diagram: two clusters, CA-A with Master A and CA-B with Master B, each master controlling its group of slave nodes through PBS]
  • An independent simple CA is installed at each master node.
  • A group of slave nodes is controlled by each master node's PBS scheduler.
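  A hedged sketch of how a job would reach each cluster's PBS scheduler through Globus GRAM (the jobmanager-pbs service name is an assumption about this setup; host names follow slide 20):

    # Hedged sketch: submitting a trivial test job to each master's PBS
    # queue via the Globus GRAM jobmanager. "jobmanager-pbs" is assumed.
    grid-proxy-init    # create a user proxy first
    globus-job-run pknuGB01.pknu.ac.kr/jobmanager-pbs /bin/hostname
    globus-job-run pknuGB05.pknu.ac.kr/jobmanager-pbs /bin/hostname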

  20. CA information of each cluster

  CA-A: pknuGB01.pknu.ac.kr
    subject  : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1/CN=proxy
    issuer   : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1
    identity : /O=uMeteoK/OU=pknu.ac.kr/CN=pknuGB1
    type     : full legacy globus proxy
    strength : 512 bits
    path     : /tmp/x509up_u533
    timeleft : 10:01:23

  CA-B: pknuGB05.pknu.ac.kr
    subject  : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215/CN=proxy
    issuer   : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215
    identity : /O=uMeteoK/OU=PKNU/OU=pknu.ac.kr/CN=pknuCA2215
    type     : full legacy globus proxy
    strength : 512 bits
    path     : /tmp/x509up_u535
    timeleft : 10:53:37
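  The listings above have the shape of grid-proxy-info output; a hedged sketch of producing them (the 12-hour lifetime is an illustrative assumption consistent with the roughly 10-hour timeleft values shown):

    # Hedged sketch: creating and inspecting the proxy credentials above.
    grid-proxy-init -hours 12   # sign a short-lived proxy with the user certificate
    grid-proxy-info             # print subject, issuer, type, strength, path, timeleft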

  21. Monitoring system on the CG testbed [Screenshots: before integration and after integration]

  22. Globus script file for parallel MM5 run (mm5.rsl)

    + ( &(resourceManagerContact="pknuGB01")
         (count=4)
         (label="subjob 0")
         (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0)
                      (LD_LIBRARY_PATH /usr/local/globus/lib/))
         (directory="/spring/KISTI/MM5/Run")
         (executable="/spring/KISTI/MM5/Run/mm5.mpp") )
      ( &(resourceManagerContact="pknuGB05")
         (count=4)
         (label="subjob 4")
         (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1)
                      (LD_LIBRARY_PATH /usr/local/globus/lib/))
         (directory="/summer/KISTI/MM5/Run")
         (executable="/summer/KISTI/MM5/Run/mm5.mpp") )
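  A hedged sketch of launching this DUROC multirequest (the slide does not say which invocation the team used; both forms below are standard for Globus 2.x with MPICH-G2):

    # Hedged sketch: submitting the two-cluster MM5 multirequest above.
    globusrun -f mm5.rsl          # submit the RSL file directly
    mpirun -globusrsl mm5.rsl     # or via the MPICH-G2 mpirun wrapper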

  23. Parallel MM5 benchmarks with Globus
  • Average job waiting time (including CA authentication): 25 sec
  • [Chart: required time for a 3600 sec (1-hour) model integration]
  • [Chart: required time for an 86400 sec (1-day) model integration]
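  A hedged sketch of how such wall-clock figures could be collected (the harness is an illustration, not the team's actual benchmark script):

    # Hedged sketch: wall-clock timing of a grid MM5 submission and run.
    time globusrun -f mm5.rsl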

  24. [Diagram: uMeteo-K Data Grid configuration. Within the uMeteo-K Data Grid, model output and data input flow among KMA, SNU, PKNU, and the KISTI supercomputer; model output and forecast output are exchanged externally with COLA, NCEP, NASA, and JMA]
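  In a Globus 2.4 data grid, transfers like these would typically ride on GridFTP; a hedged sketch (the output file name and archive path are hypothetical; the host and run directory follow slide 22):

    # Hedged sketch: moving one model-output file across the data grid
    # with GridFTP. File name and destination path are hypothetical.
    globus-url-copy \
      gsiftp://pknuGB01.pknu.ac.kr/spring/KISTI/MM5/Run/MMOUT_DOMAIN1 \
      file:///archive/mm5/MMOUT_DOMAIN1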

  25. Thank you for your attention!
