
Production Grids






Presentation Transcript


  1. Production Grids: General Overview. Fabrizio Gagliardi, Director, Technical Computing, Microsoft EMEA. Surrounding themes: Social Sciences; Computer and Information Sciences; Earth Sciences; Life Sciences; Multidisciplinary Research; New Materials, Technologies and Processes; Math and Physical Science

  2. Talk Outline: A few personal remarks; Need for e-Infrastructures and Grids; Background history; Outlook to the future; MS contribution

  3. A few personal remarks: GGF1 and EDG; EGEE and GGF experience; Acknowledgements; Move to MS

  4. Need for e-Infrastructures: Science, industry and commerce are increasingly digital, process vast amounts of data and need massive computing power. We live in a “flat” world: science is more and more an international collaboration and often requires a multidisciplinary approach. We need to use technology for the good cause and fight the Digital Divide. Industrial uptake has become essential

  5. Some examples (from EU-EGEE): Earth sciences applications • Satellite observations: ozone profiles • Solid Earth physics: fast determination of the mechanisms of important earthquakes • Hydrology: management of water resources in the Mediterranean area (SWIMED) • Geology: GeoCluster, an R&D initiative of the Compagnie Générale de Géophysique

  6. The SEE-GRID initiative (SEEREN) > http://www.see-grid.org GGF16, Athens, 13 February 2006

  7. SEE-GRID e-Infrastructure used for regional applications: visualization of medical diagnostic images, Earth sciences, thermodynamics, fluid dynamics, etc. Already deployed and used in the University Children’s Hospital in Belgrade. Search Engine for South-East Europe (SE4SEE): Grid-based Web-crawling and domain-specific Web page filtering service. Grown from the original 21 organizations in 11 countries to 40 organizations in two years

  8. Drug Discovery: a new hope for developing countries • Grid-enabled drug discovery process • Reduce the time required to develop drugs • Develop the next steps of the process (molecular dynamics) • Data challenge proposal for docking on malaria • Never done on a large-scale production infrastructure • Never done for a neglected disease • Data challenge during the summer • 5 different structures of the most promising target • Output data: 16.5 million results, ~10 TB • Added value • Facilitates inclusion of developing countries • Tool to enhance collaboration building • Facilitates distributed software development for complex integrated workflows

  9. The underlying infrastructure

  10. BioMed Grid • Infrastructure • ~2,000 CPUs • ~21 TB of disk • in 12 countries • >50 users in 7 countries working with 12 applications • 18 research labs • ~80,000 jobs launched since 04/2004 • ~10 CPU-years [chart: number of jobs per month] HP HPC Forum, Divonne

  11. The EGEE Grid www.eu-egee.org

  12. The EU research network

  13. GRNET e-Infrastructure • Backbone network and Athens Metropolitan Area Network (MAN): 2.5 Gbps lambda (Packet over SDH, PoS) • Crete regional network and major institute access: Gigabit Ethernet • Dark fibre acquired for 2 network spans • Major dark fibre tender for the whole backbone under way • HellasGrid Grid infrastructure hosted by major research and academic institutes in Athens (Demokritos, IASA, NDC), Crete (ICS-FORTH), Patra (CTI-CEID), Thessaloniki (AUTH-UoM): • ~800 CPUs (x86_64, 2 GB RAM, 80 GB HDD, 2x Gbit) • ~30 TB total raw SAN storage capacity across the HG nodes • ~60 TB tape library • 4 Access Grid nodes

  14. HellasGrid Production Infrastructure • Infrastructure and targeted projects funded by the 2M Euro HellasGrid project • Operations, training and policies supported mainly by the EC co-funded FP6 project EGEE • EGEE was a major catalyst in securing HellasGrid funding • Major institutes involved as Third Parties in the EGEE project • EuGridPMA-accredited Certification Authority • Application areas supported by the local Ministry: physics, biomedical, chemistry, other • Major call for applications integration under way, ~0.5M Euro (might be upgraded to >1M because of the great interest shown)

  15. Current situation: accomplishments and challenges. Many Grids exist around the world, but very few are maintained as a persistent infrastructure. There is a need for public and open Grids (OSG, EGEE and related projects, NAREGI, TERAGRID and DEISA are good prototypes). Persistence, support, sustainability, long-term funding and easy access are the major challenges

  16. Key Challenges • Security (see Blair Dillaway’s talk) • Stable accepted industrial standards (GGF and EGA converging) • Learning curve for applications • Complexity of running a Grid infrastructure across different administrative domains

  17. How did we get here? Meta-computing and distributed computing: early examples in the ’80s and ’90s (CASA, I-Way, Unicore, Condor, etc.). EU FP5 and US Trillium and national Grids. EU FP6, US OSG, NAREGI/Japan… UK e-Science programme and similar national programmes

  18. Related Grid Projects: an ecosystem of prototype Grid projects (GriPhyN, PPDG, iVDGL) in the US, EU and Asia-Pacific in 2003

  19. Where are we going?

  20. Top 500 Supercomputer Trends • Clusters over 50% • Industry usage rising • IA is winning • GigE is gaining

  21. Supercomputing Goes Personal

  22. The Future: Supercomputing on a Chip • IBM Cell processor • 256 Gflops today • 4 node personal cluster => 1 Tflops • 32 node personal cluster => Top100 • MS Xbox • 3 custom PowerPCs + ATI graphics processor • 1 Tflops today • $300 • 8 node personal cluster => “Top100” for $2500 (ignoring all that you don’t get for $300) • Intel many-core chips • “100’s of cores on a chip in 2015” (Justin Rattner, Intel) • “4 cores”/Tflop => 25 Tflops/chip
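The projection in the last bullet works out as follows, a rough back-of-the-envelope check using the slide’s own figures (the numbers are the slide’s, not an official Intel roadmap):

```latex
\frac{100\ \text{cores per chip}}{4\ \text{cores per Tflop}} = 25\ \text{Tflops per chip}
```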

  23. The Continuing Trend Towards Decentralized, Networked Resources: Mainframes → Minicomputers → Personal workstations & departmental servers → Grids of personal & departmental clusters

  24. MS HPC road map Compute Cluster Solution (CCS) V1: • Beowulf-style compute cluster seamlessly integrated into a Windows-based infrastructure, including the security infrastructure (Kerberos and Active Directory); user jobs run under the Windows user’s credentials • ISV support for the most common software applications • Job scheduler accessible via the command line, COM and a published Web services protocol. Can be customized via sysadmin-defined admission and release filters that run in the scheduler when a job is submitted and when it becomes ready to run. In V2 the scheduler will be made even more extensible. • Performance will be comparable to Linux • MPICH (incl. MPICH-2) supported (open-source version from ANL)
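The admission and release filters mentioned above are site-defined hooks that accept or reject jobs as the scheduler processes them. The sketch below illustrates the idea in Python; it is a hypothetical example, not the real CCS filter interface: the XML element and attribute names, the processor cap, and the admit/reject convention are all assumptions made for illustration.

```python
# Hypothetical sketch of an admission filter: the scheduler is assumed to
# hand the filter a job description (XML) and to honour the filter's
# verdict (admit or reject). All names and limits here are illustrative.
import xml.etree.ElementTree as ET

MAX_PROCESSORS = 64  # illustrative site-wide cap, set by the sysadmin

def admit(job_xml: str) -> bool:
    """Return True if the job's processor request is within the site cap."""
    root = ET.fromstring(job_xml)
    # Jobs that do not state a processor count are assumed to want one CPU.
    requested = int(root.get("MaximumNumberOfProcessors", "1"))
    return requested <= MAX_PROCESSORS

# Example: a modest job is admitted, an oversized one is rejected.
print(admit('<Job MaximumNumberOfProcessors="8"/>'))    # True
print(admit('<Job MaximumNumberOfProcessors="128"/>'))  # False
```

In the real product the filter would be wired into the scheduler’s submission pipeline rather than called directly; the point here is only the shape of the check.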

  25. MS HPC road map CCS areas of focus for V2: • Extend to forests of clusters and meta-schedulers • Storage and parallel I/O issues, as well as possibly simple workflow support • Development of new tools

  26. Conclusions: Production Grids are a reality and here to stay. HPC is becoming a commodity, seamlessly stretching from the desktop to back-end resources. MS HPC products are coming soon. Microsoft is participating in the major standardisation bodies. MS is present at this meeting and is a prime sponsor

  27. Wish everybody a fruitful and positive week in Athens. You can contact me at: fabrig@microsoft.com
