IDC HPC USER FORUM Weather & Climate PANEL September 2009 Broomfield, CO


Presentation Transcript


  1. IDC HPC USER FORUM Weather & Climate PANEL, September 2009, Broomfield, CO
  • Panel questions: 1 response per question
  • Limit length to 1 slide

  2. Panel Format
  • <insert panel format here>
  • Sequence: alphabetical
  • A few bullet points for each question; each participant can address or supplement them
  • After each panel member has finished, we move on to the next question
  • Moderators may adjust depending on discussion and time constraints

  3. Panel Members
  • <insert panel moderators here>
  • Steve Finn & Sharan Kalwani
  • <insert panel participants & affiliation here>

  4. Q1. Relative importance of data/resolution/microphysics
  • To a certain degree, data assimilation, resolution, and physics are all important (non-linear system of eqns.)
  • Metric dependent: quantitative precipitation skill, low-level winds, clouds, forecast length (nowcast vs. climate)
  • For typical Navy metrics (winds, visibility, waves, clouds):
    • Data assimilation is essential (accurate synoptic and mesoscale initial state, spin-up of physics)
    • Physics: boundary layer, surface layer, cloud/convection
    • Resolution
      • Sufficient to capture key geographic features
      • High enough to avoid bulk parameterizations: convection (Δx ≈ 2-4 km), turbulence (Δx ≈ 20-200 m)
  • Predictability: tradeoffs between ensembles & Δx
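As a back-of-envelope illustration of the resolution bullets above (not from the slides): with a CFL-limited time step, refining the horizontal grid by a factor r costs roughly r² more columns and r more time steps. A minimal Python sketch, with a hypothetical `relative_cost` helper:

```python
# Back-of-envelope cost scaling (illustrative, not from the slides):
# with a CFL-limited time step, cost ~ nx * ny * nt, so refining the
# horizontal grid by a factor r multiplies cost by roughly r^3.
def relative_cost(dx_old_km: float, dx_new_km: float) -> float:
    """Cost multiplier for refining horizontal resolution (hypothetical helper)."""
    r = dx_old_km / dx_new_km   # refinement factor
    return r ** 3               # r^2 (horizontal points) * r (time steps)

# 20 km synoptic grid -> 2 km convection-permitting grid: ~1000x the cost
print(relative_cost(20.0, 2.0))
# 2 km -> 20 m turbulence-resolving grid: another ~1e6x on top of that
print(relative_cost(2.0, 0.02))
```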

  5. Q2. Adaptive mesh or embedded grids: their impact
  • Please discuss the use of adaptive mesh or embedded grids (urban areas, terrain impact on advection, ...) and how increased future use would impact system requirements such as system interconnects.
  • Adaptive meshes are challenging
    • Time step and run time (for operations) issues
    • Physical parameterizations (resolution dependence)
    • Mesh refinement needs to consider complex multi-scale interactions (difficulty in determining high-resolution areas)
  • Nested grids currently used in Navy mesoscale model
    • Moving meshes to follow features (hurricanes), ships
  • Impact on system requirements (interconnects)
    • Load balance may be an issue (decomposition)
    • Cores as a function of grid points (communication)
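A hedged sketch of the "cores as a function of grid points" concern: under a 2-D domain decomposition, each core's halo grows relative to its interior as the core count rises, so communication increasingly dominates. The `halo_to_interior` helper below is illustrative and assumes a square process grid:

```python
import math

def halo_to_interior(nx: int, ny: int, ncores: int, halo: int = 1) -> float:
    """Ratio of halo (communicated) points to interior (computed) points
    per core, assuming a square process grid -- a deliberate simplification."""
    p = int(math.sqrt(ncores))      # cores per side of the process grid
    sx, sy = nx // p, ny // p       # local subdomain size
    interior = sx * sy
    ring = (sx + 2 * halo) * (sy + 2 * halo) - interior
    return ring / interior

for cores in (64, 256, 1024, 4096):
    print(cores, round(halo_to_interior(1000, 1000, cores), 3))
# The ratio climbs with core count, so interconnect latency/bandwidth and
# load balance matter more and more -- the concern raised on the slide.
```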

  6. Q3. Ratio of data to compute: background
  • Problem dimension: nNest · nx · ny · nz · nVariables · nTimeLevels · Precision
    • Today: 5 × 100 × 100 × 50 × 50 × 3 × 4
    • Future: 5 × 100 × 100 × 100 × 100 × 3 × 8
  • What are your bytes per flop for future requirements?
  • This is a relatively low ratio compared to some other benchmarks (Geerd Hoffmann of DWD cited 4 bytes/flop at the October 2007 HPC User Forum in Stuttgart).
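The state-size arithmetic implied by those dimensions can be checked directly; a minimal Python sketch (the `state_bytes` helper is illustrative, not from the slides):

```python
def state_bytes(n_nest, nx, ny, nz, n_vars, n_time_levels, precision):
    # Product of the slide's problem dimensions; precision is in bytes.
    return n_nest * nx * ny * nz * n_vars * n_time_levels * precision

today  = state_bytes(5, 100, 100,  50,  50, 3, 4)   # 1.5e9 bytes
future = state_bytes(5, 100, 100, 100, 100, 3, 8)   # 1.2e10 bytes
print(f"today:  {today / 1e9:.1f} GB")   # ~1.5 GB of model state
print(f"future: {future / 1e9:.1f} GB")  # ~12.0 GB of model state
# At 4 bytes/flop (the figure cited above), every floating-point operation
# implies 4 bytes of memory traffic -- a bandwidth-bound workload.
```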

  7. Q4. Open source codes in the community
  • What is the importance and impact of open source / community code applications such as CCSM, WRF, ...?
  • Information assurance issues for DoD
    • Open source may be problematic for operations
  • For the Navy, open source code can be useful for:
    • Physics (from other institutions, agencies)
    • Frameworks (Earth System Modeling Framework)
    • Post-processing, graphics, etc.
  • COAMPS code has > 350 users
    • Fosters collaboration and leverages expertise (within and beyond CWO) among agencies and universities

  8. Q5. Data and collaboration, formats, future needs
  • What is the level of collaboration and standardization of data management and of observational & results databases, such as use of common file formats, web-based data, etc.? What is needed in the future?
  • Common file standards for exchange among agencies (GRIB2 for operations, some netCDF for research)
  • Static databases (land characteristics, etc.) are commonly shared, but often not standardized
  • Standardized observational databases (a common format with other agencies is being considered)
  • Future:
    • Much larger databases
    • Larger need for standardized output (and input) for community shared projects (TIGGE, HFIP, etc.)
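For concreteness, a minimal sketch of writing a field in one of the shared formats mentioned above, using the Python netCDF4 library; the variable name, units, and grid size are illustrative assumptions, not any agency's standard:

```python
import numpy as np
from netCDF4 import Dataset

# Write a single 2-D field to netCDF; names and sizes are illustrative.
with Dataset("forecast_sample.nc", "w") as ds:
    ds.createDimension("lat", 73)
    ds.createDimension("lon", 144)
    t2m = ds.createVariable("t2m", "f4", ("lat", "lon"))
    t2m.units = "K"
    t2m.long_name = "2 m air temperature"
    t2m[:, :] = np.full((73, 144), 288.0, dtype="f4")
```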

  9. Q6. Ensemble models: your experiences
  • Has the use of ensemble models had any positive or negative impact in reducing the code scaling requirements?
  • The limiting factor is how well the deterministic model scales
  • Ensembles are embarrassingly parallel and should perform well on large multi-core clusters
  • Need efficient I/O to exchange state information between the model output and post-processing (DA)
  • Ensemble approaches present some challenges for post-processing (archival) and file management
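A hedged illustration of the "embarrassingly parallel" point: ensemble members are independent runs, so they can be farmed out with no inter-member communication. `run_member` below is a stand-in, not a real model launcher:

```python
from multiprocessing import Pool

def run_member(member_id: int) -> str:
    # Stand-in for launching the deterministic model with perturbed initial
    # conditions; overall scaling is limited by that model, as noted above.
    return f"member_{member_id:03d}.nc"

if __name__ == "__main__":
    with Pool(processes=8) as pool:
        outputs = pool.map(run_member, range(20))  # 20 independent members
    # The serial chokepoint is gathering member output for post-processing
    # and data assimilation -- the I/O concern raised on the slide.
    print(outputs[:3])
```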

  10. Q7. Obligatory question (no pun intended!): cloud computing, your views (unfiltered)
  • What is your current / future interest in grid or cloud computing?
  • Grid / cloud computing may potentially work well for ensembles, although there are obvious challenges (I/O)
  • Domain decomposition across the grid could present big challenges
  • Models require huge input datasets and produce large output datasets (persistent data storage)
  • Model paradigm may have to be revisited (communication latency between nodes might not be consistent)
  • Information assurance could be an issue (particularly for DoD operations)
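A back-of-envelope sketch of the latency concern (all numbers assumed, not measured): per-step halo-exchange time under a simple latency + bandwidth model, comparing a low-latency HPC interconnect with a higher-latency commodity network:

```python
def halo_exchange_time(msg_bytes, latency_s, bandwidth_bps, neighbors=4):
    """Per-step exchange time for one rank with `neighbors` halo partners
    (simple latency + bandwidth model; all numbers below are assumptions)."""
    return neighbors * (latency_s + msg_bytes / bandwidth_bps)

msg = 10e3  # 10 kB halo message per neighbor (illustrative)
for name, lat in [("HPC interconnect", 2e-6), ("commodity cloud", 100e-6)]:
    t = halo_exchange_time(msg, lat, 1e9)
    print(f"{name}: {t * 1e6:.0f} us/step")
# Over thousands of time steps per forecast, the latency term dominates on
# high-latency networks, while ensembles (no inter-member traffic) are
# largely unaffected -- consistent with the bullets above.
```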
