
Expedition Workshop

This workshop explores efficient data-management strategies for petascale, and eventually exascale, processing of ultra-large data collections, enabling their access and discovery. It focuses on the challenges of, and emerging solutions for, managing the expanding digital universe of data.


Presentation Transcript


  1. Expedition Workshop: Towards Scalable Data Management. June 10, 2008. Chris Greer, Director, NCO.

  2. NITRD: Networking and Information Technology Research and Development Program. NCO: National Coordination Office.

  3. NITRD Program Structure [organizational chart]: White House → Executive Office of the President → Office of Science and Technology Policy → National Science and Technology Council → Committee on Technology → NITRD Subcommittee, supported by the National Coordination Office (NCO). Program component areas: High End Computing (HEC I&A and R&D); Cyber Security and Information Assurance; High Confidence Software and Systems; Social, Economic, and Workforce; Human Computer Interaction and Information Management; Software Design and Productivity; Large Scale Networking.

  4. NITRD Program Structure [organizational chart, same content as slide 3].

  5. Workshop Focus: effective, efficient petascale (approaching exascale) processing and throughput of ultra-large data collections, enabling access and discovery.

  6. Workshop Focus: effective, efficient petascale (approaching exascale) processing and throughput of ultra-large data collections, enabling access and discovery.

  7. “In 2006, the amount of digital information created, captured, and replicated was 1,288 x 10^18 bits (or 161 exabytes) … This is about 3 million times the information in all the books ever written.” The Expanding Digital Universe, IDC White Paper sponsored by EMC, March 2007.
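
A quick back-of-envelope check of the unit conversion quoted above; this sketch is added for clarity and assumes decimal (SI) prefixes, i.e. 1 exabyte = 10^18 bytes.

```python
# Sanity check of the IDC figure quoted above: 161 exabytes expressed in bits.
# Assumes decimal (SI) prefixes: 1 exabyte = 10**18 bytes.
EXABYTE_BYTES = 10**18              # bytes per exabyte (SI)
total_bytes = 161 * EXABYTE_BYTES   # 161 exabytes
total_bits = total_bytes * 8        # 8 bits per byte

print(total_bits)                   # 1288000000000000000000
print(total_bits // 10**18)         # 1288, i.e. 1,288 x 10^18 bits
```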

  8. Information and Storage [chart: worldwide information created vs. available storage, in petabytes; the gap between the two represents transient information or unfilled demand for storage]. Source: John Gantz, IDC Corporation, The Expanding Digital Universe.

  9. Workshop Focus: effective, efficient petascale (approaching exascale) processing and throughput of ultra-large data collections, enabling access and discovery.

  10. “Sometime in the 2010s, if all goes well, the Large Synoptic Survey Telescope (LSST) will start to bring a vision of the heavens to Earth. Suspended between its vast mirrors will be a three billion-pixel sensor array, which on a clear winter night will produce 30 terabytes of data. In less than a week this remarkable telescope will map the whole night sky …. And then the next week it will do the same again … building up a database of billions of objects and millions of billions of bytes.” Nature 440:383
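
To put the quoted LSST figures in rough perspective, here is an illustrative estimate of the survey's total data volume; the 300-nights-per-year and 10-year-survey assumptions are mine, not from the slide.

```python
# Rough LSST data-volume estimate from the figures quoted above.
# Assumptions (not from the slide): ~300 usable observing nights per year,
# a 10-year survey, and decimal (SI) units (1 PB = 1000 TB).
tb_per_night = 30        # terabytes produced on a clear night (from the quote)
nights_per_year = 300    # assumed usable observing nights per year
survey_years = 10        # assumed survey duration

pb_per_year = tb_per_night * nights_per_year / 1000
pb_total = pb_per_year * survey_years

print(f"~{pb_per_year:.0f} PB per year")                 # ~9 PB/year
print(f"~{pb_total:.0f} PB over {survey_years} years")   # ~90 PB, i.e. ~9e16 bytes
```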

  11. Large Hadron Collider. Physicists will use the LHC to recreate the conditions just after the Big Bang, by colliding two beams [of hadrons] head-on at very high energy. Source: public.web.cern.ch/Public/en/LHC/LHC-en.html

  12. When the LHC begins operations, it will produce roughly 15 petabytes of data annually, which thousands of scientists around the world will access and analyse … The mission of the LHC Computing Grid (LCG) is to build and maintain a data storage and analysis infrastructure for the entire high energy physics community that will use the LHC. Source: public.web.cern.ch/Public/en/LHC/LHC-en.html
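
For a sense of the sustained throughput behind the 15-petabyte annual figure, the rough conversion below (illustrative only, assuming decimal petabytes and a steady rate) works out to roughly half a gigabyte per second on average.

```python
# Average data rate implied by ~15 PB of LHC data per year (from the quote).
# Assumes decimal (SI) petabytes; actual data-taking is bursty, so peak rates
# are much higher than this long-run average.
pb_per_year = 15
bytes_per_year = pb_per_year * 10**15
seconds_per_year = 365 * 24 * 3600

avg_mb_per_sec = bytes_per_year / seconds_per_year / 10**6
print(f"~{avg_mb_per_sec:.0f} MB/s sustained average")   # ~476 MB/s
```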

  13. Satellite Tobacco Mosaic Virus Structure Simulation. P. L. Freddolino, A. S. Arkhipov, S. B. Larson, A. McPherson, and K. Schulten, Structure 14:437-449 (2006).

  14. Contact: nco@nitrd.gov
