
Visualization Support for XSEDE and Blue Waters


Presentation Transcript


  1. Visualization Support for XSEDE and Blue Waters DOE Graphics Forum 2014

  2. Organization: Advanced Digital Services

  3. Advanced Digital Services
  • Common Blue Waters and XSEDE functions
  – User support (including visualization)
  – Network infrastructure
  – Storage
  – Operations
  • Support for future NCSA efforts
  • Resources managed by service level agreements

  4. Advanced Digital Services
  • Visualization Group – support for data analysis and visualization
  • XSEDE
  – Dave Bock
  – Mark VanMoer
  • Blue Waters
  – Dave Semeraro
  – Rob Sisneros

  5. Visualization Support
  • Software
  – ParaView
  – VisIt
  – yt
  – IDL (coming soon)
  – (ncview, matplotlib, ffmpeg, …)
  • OpenGL driver on Blue Waters XK nodes
  • Direct user support
  – Data analysis / custom rendering
  – Scaling analysis tools / parallel I/O
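To give a flavor of the supported batch workflow, here is a minimal sketch of off-screen rendering with yt, one of the packages listed above. The dataset path and field name are hypothetical placeholders; on Blue Waters or an XSEDE resource a script like this would normally run through the batch scheduler rather than interactively.

```python
# Minimal sketch: headless slice rendering with yt (dataset path and field
# are placeholder assumptions, not taken from the slides).
import yt

ds = yt.load("simulation_output/plt0100")        # hypothetical simulation dump
slc = yt.SlicePlot(ds, "z", ("gas", "density"))  # axis-aligned slice through the domain
slc.set_cmap(("gas", "density"), "viridis")      # purely cosmetic colormap choice
slc.save("density_slice.png")                    # renders to an image file; no display needed
```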

  6. XSEDE

  7. Blue Waters visualization examples: stellar magnetic and temperature field; supernova ignition bubble; atmospheric downburst

  8. Blue Waters Compute System
  • System
  – Total Peak Performance: 13.34 PF
  – Total System Memory: 1.476 PB
  – XE Bulldozer Cores*: 362,240
  – XK Bulldozer Cores* (CPU): 33,792
  – XK Kepler Accelerators (GPU): 4,224
  • Interconnect
  – Architecture: 3D torus
  – Topology: 24x24x24
  – Compute nodes per Gemini: 2
  • Storage
  – Capacity: 26.4 PB
  – Bandwidth: > 1 TB/sec
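These figures are internally consistent, as the sketch below shows. It assumes the standard Cray XE6/XK7 packaging, which the slide does not state: 2 Interlagos sockets per XE node, 1 socket plus 1 Kepler GPU per XK node, and 8 Bulldozer modules (the "cores" counted here) per socket.

```python
# Sanity check of the slide's figures under assumed Cray XE6/XK7 packaging.
XE_CORES, XK_CORES, GPUS = 362_240, 33_792, 4_224
MODULES_PER_SOCKET = 8             # Bulldozer modules per Interlagos socket (assumption)

xe_nodes = XE_CORES // (2 * MODULES_PER_SOCKET)   # 2 sockets per XE node -> 22,640
xk_nodes = XK_CORES // (1 * MODULES_PER_SOCKET)   # 1 socket per XK node  -> 4,224
assert xk_nodes == GPUS                           # one Kepler per XK node

gemini_chips = 24 * 24 * 24        # 24x24x24 3D torus
endpoints = gemini_chips * 2       # 2 compute nodes per Gemini -> 27,648 slots
assert xe_nodes + xk_nodes <= endpoints  # remaining slots host service/IO nodes

print(xe_nodes, xk_nodes, endpoints)
```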

  9. Blue Waters Visualization System
  • Identical specifications to the compute system on the previous slide: visualization runs on the same XE/XK nodes (note the OpenGL driver on the XK nodes, slide 5) rather than on a separate dedicated system

  10. Blue Waters Allocations
  • GLCPC – 2%
  • PRAC – over 80%
  • Illinois – 7%
  • Education – 1%
  • Project Innovation and Exploration
  • Industry

  11. GLCPC: Great Lakes Consortium for Petascale Computing
  GLCPC Mission: “…facilitate and coordinate multi-institutional efforts to advance computational and computer science engineering, and technology research and education and their applications to outstanding problems of regional or national importance…”
  • 2% allocation
  • 501c3 organization*
  • 28 charter members**
  • Executive Committee
  • Allocations Committee
  • Education Committee
  * State 501c3 filing complete, federal filing in progress
  ** The 28 charter members represent over 80 universities, national laboratories, and other education agencies

  12. Industry S&E Teams
  • Industry can participate in the NSF PRAC process
  • 5+% of the allocation can be dedicated to industrial use
  • Specialized support by the NCSA Private Sector Program (PSP) staff
  – Blue Waters staff will support the PSP staff as needed
  • Potential to provide specialized services within Service Level Agreement parameters
  – E.g. throughput, extra expertise, specialized storage provisions, etc.
  • High interest shared by partner companies in the following:
  – Scaling capability of a well-known and validated CFD code
  – Temporal and transient modeling techniques and understanding
  • Two example cases under discussion:
  – NASA OVERFLOW at scale for CFD flows
  – Temporal modeling techniques using the freezing of H2O molecules as a use case, both to conduct large-scale single runs and to gain significant insight by reducing uncertainty

  13. Real Improvement in Time to Solution
  • 10,560³ grid inertial confinement fusion (ICF) calculation with multifluid PPM
  • Rendered 13,688 frames at 2048x1080 pixels
  – 4 panels per view & 2 views per stereo image @ 4096x2160 pixels
  – Stereo movie is 1,711 frames
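The frame counts check out against each other. Assuming the 4 panels per view tile as a 2x2 grid (the one detail the slide leaves implicit), 2048x1080 tiles assemble into the quoted 4096x2160 views, and dividing the rendered panels by 4 panels per view and 2 views per stereo image yields exactly the stated movie length:

```python
# Frame arithmetic from the slide; the 2x2 tiling is an inference.
tile_w, tile_h = 2048, 1080
assert (2 * tile_w, 2 * tile_h) == (4096, 2160)   # 4 tiles -> one assembled view

panels_rendered = 13_688
stereo_frames = panels_rendered // (4 * 2)        # 4 panels/view x 2 views/frame
assert stereo_frames == 1_711                     # matches the stated movie length
```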
