
IDC HPC USER FORUM Weather & Climate PANEL September 2009 Broomfield, CO








  1. IDC HPC USER FORUM Weather & Climate PANEL — September 2009, Broomfield, CO
  • Panel questions: 1 response per question
  • Limit length to 1 slide

  2. Panel Format
  • <insert panel format here>
  • Sequence – alphabetical
  • A few bullet points for each question; each participant can address/supplement them
  • After each panel member has finished, we move on to the next question
  • Moderators can adjust depending on discussion and time constraints

  3. Panel Members
  • <insert panel moderators here>
  • Steve Finn & Sharan Kalwani
  • <insert panel participants & affiliation here>

  4. Q1. Relative importance of data/resolution/microphysics
  • Please quantify the relative importance of improvements in observational data, grid resolution, and cloud microphysics for future forecast accuracy.
  • For prediction
    • Observations and understanding of observations
    • Data assimilation
    • Ensembles
  • Physics
    • Scale-appropriate
    • Sensitivities
    • Superparameterizations
  • Resolution
    • Explicitly resolve scales
    • Convergence studies, feeding back to prediction

  5. Q2. Adaptive mesh or embedded grids: their impact
  • Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...) and how future increased use would impact system requirements such as system interconnects.
  • Nesting
    • Domains interact sequentially
    • Scatter-gather 3D fields between domains
  • Spatial refinement
    • In place, adding cells
  • Temporal refinement (future)
  • Adaptivity (future)
  • Coupling
    • Load balancing, bandwidth
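The parent-to-nest exchange above can be sketched in miniature. This is a hypothetical 1D illustration only; real models such as WRF scatter and gather full 3D fields between parent and nest domains, and the function names here are invented:

```python
# Hypothetical sketch of one parent -> nest -> parent exchange in 1D.
REFINE = 3  # spatial refinement factor (odd ratios are typical in nesting)

def scatter(parent, i0, i1):
    """Interpolate parent cells [i0, i1) onto the fine nest
    (piecewise-constant interpolation for simplicity)."""
    return [v for v in parent[i0:i1] for _ in range(REFINE)]

def gather(parent, nest, i0):
    """Feed the nest solution back onto the overlapping parent cells
    by averaging each group of REFINE fine cells."""
    for k in range(len(nest) // REFINE):
        chunk = nest[k * REFINE:(k + 1) * REFINE]
        parent[i0 + k] = sum(chunk) / REFINE
    return parent

parent = [0.0, 1.0, 2.0, 3.0]
nest = scatter(parent, 1, 3)      # [1.0, 1.0, 1.0, 2.0, 2.0, 2.0]
parent = gather(parent, nest, 1)  # averaging undoes the scatter here
```

Because the domains interact sequentially, each such exchange is a communication phase; with many nests, load balancing and interconnect bandwidth (the slide's last bullet) set the cost.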

  6. Q3. Ratio of data to compute: background
  • What are your bytes per flop for future requirements?
  • Assuming the question means "bytes of main memory per sustained flop/s" (D. H. Bailey)
  • Current – lots of headroom
    • ~2000 ops per cell per second; ~800 bytes (4-byte floats) per cell = 0.4 bytes per op
  • Future
    • Resolution follows the 3/4 rule (2/3 in practice)
    • Adding physics or chemistry should not upset this ratio
  • This is a relatively *low* ratio compared to some other benchmarks (Geerd Hoffmann of DWD cited 4 bytes/flop at the Oct 2007 HPC User Forum in Stuttgart)
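The slide's arithmetic, plus the 3/4 rule, can be checked back-of-the-envelope (a minimal sketch using the slide's own per-cell estimates):

```python
# Bytes of main memory per sustained op/s, from the slide's estimates.
bytes_per_cell = 200 * 4        # ~800 bytes: 200 four-byte floats per cell
ops_per_cell_per_s = 2000       # ~2000 sustained ops per cell per second
ratio = bytes_per_cell / ops_per_cell_per_s
print(ratio)                    # 0.4 bytes per op

# "3/4 rule": halving the grid spacing doubles the points in each of three
# dimensions (8x memory) and roughly halves the time step (16x compute),
# so memory requirements grow as compute**(3/4).
mem_growth, compute_growth = 8, 16
print(mem_growth == compute_growth ** 0.75)   # True
```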

  7. Q4. Open-source codes in the community
  • What is the importance and impact of open-source / community-code applications such as CCSM, WRF, ...?
  • Common modeling tool to foster interaction, outreach, and ultimately advancement of the science
  • Relevant HPC application benchmarks

  8. Q5. Data and collaboration, formats, future needs
  • What is the level of collaboration and standardization in data management and in observational & results databases, such as use of common file formats, web-based data, etc.? What is needed in the future?
  • Scientific and technical interoperability for multi-model simulation systems
  • Metadata formalisms, conventions, infrastructure:
    • Earth System Curator (www.earthsystemcurator.org)
    • Earth System Grid (www.earthsystemgrid.org)

  9. Q6. Ensemble models: your experiences
  • Has the use of ensemble models had any positive or negative impact in reducing code scaling requirements?
  • Ensembles have a positive effect on scaling because they are trivially scalable
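The "trivially scalable" point can be illustrated with independent members run in parallel. This is a hypothetical sketch: `run_member` merely stands in for a full forecast integration from perturbed initial conditions.

```python
# Ensembles scale trivially: members are independent runs with no
# inter-member communication, so they parallelize across nodes freely.
from concurrent.futures import ProcessPoolExecutor

def run_member(seed):
    # Stand-in "forecast": a deterministic function of the perturbation seed.
    return (seed * 1103515245 + 12345) % 100

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        forecasts = list(pool.map(run_member, range(8)))
    ensemble_mean = sum(forecasts) / len(forecasts)
```

Each member still has to scale on its own; the ensemble dimension adds capacity-style parallelism on top rather than reducing the per-member scaling requirement.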

  10. Q7. Obligatory question (no pun intended!) – Cloud computing: your views (unfiltered)
  • What is your current / future interest in grid or cloud computing?
  • Computational grids are not feasible for tightly coupled parallel applications
  • Reproducibility across platforms is also an issue
  • Data and observing grids are useful
  • WRF is used in LEAD (portal.leadproject.org)
