
Global Grid Efforts


Presentation Transcript


  1. Global Grid Efforts Richard Cavanaugh University of Florida

  2. (some) Global Grid Efforts • US: Grid3, NEESgrid, TeraGrid, AccessGrid, Open Science Grid • EU: DataGrid/EGEE, DataTAG, LCG, CrossGrid, GridLab, NorduGrid • Asia Pacific: PRAGMA, ApGrid • South America: CHEPREO • Canada: West Grid • UK: GridPP • Germany: D-Grid • Italy: INFN Grid • Korea: K*Grid • Australia: grangenet • Japan: NAREGI, Grid Technology Research Center • China: CNGrid • This talk will survey a sample of these efforts: diverse functionality and many active grids

  3. • Perform tele-observation and tele-operation of experiments • Publish to and make use of a curated data repository • Access computational resources and open-source analytical tools • Access collaborative tools for experiment planning, execution, analysis, and publication • 10 universities and institutes across the US • First trans-Pacific experiment carried out in March between the US and Japan

  4. • $98M project • General-purpose computational facility • ~20 teraflops • ~1 petabyte of networked disk storage

  5. Goal: an integrated U.S. Grid infrastructure • Grid computing infrastructure to support US scientific efforts • CPU & storage resources from laboratories and universities • DOE and NSF partnership • Internet2, ESNet, state, and international optical networks • Getting there: OSG-1 (Grid3), OSG-2, … a series of releases with increasing functionality & scale • Initial meetings: Sep. 17 @ NSF (educators, scientists, etc.); Jan. 12 @ Fermilab (public discussion, planning sessions) • Next steps: white paper to be expanded into a roadmap; presentation to funding agencies (this Summer?)

  6. INFN GRID

  7. GridPP • 19 UK Universities, CCLRC (RAL & Daresbury) and CERN • Funded by the Particle Physics and Astronomy Research Council (PPARC) • GridPP1, 2001-2004, £17m: "From Web to Grid" • GridPP2, 2004-2007, £15m: "From Prototype to Production" • ScotGrid, NorthGrid, SouthGrid, London Grid

  8. • Project involves several Nordic universities and HPC centers • Will continue for 3-4 more years • Forms the "North European Grid Federation" of EGEE together with the Dutch Grid, Belgium, and Estonia • Will provide middleware for the "Nordic Data Grid Facility" and related projects (SWEGRID, the Danish Grid, etc.) • Shares authentication and authorization mechanisms with EDG/LCG-1/EGEE

  9. • European project (~5 M€, 3-year project started March 2002) • Polish (Cracow & Poznan), Spanish (CSIC & CESGA), and German (FZK) partners • Objectives: extension of the Grid in Europe, assuring interoperability with DataGrid • Interactive applications ("human in the loop"): environmental fields (meteorology/air pollution, flooding crisis management), High Energy Physics (interactive analysis over distributed datasets), medicine (vascular surgery preparation)

  10. LCG (LHC Computing Grid) • Grid for the Large Hadron Collider, a particle physics experiment • 4 scientific collaborations: 1000s of scientists, 100s of institutes, 10s of countries • LCG-2 currently covers 22 countries, 62 sites (48 Europe, 2 US, 5 Canada, 6 Asia, 1 HP) and 4000 CPUs • Tiered data-distribution model (shown on the slide for the CMS experiment): experiment online system → Tier 0 (CERN Computer Center, > 20 TIPS) at 100-1500 MBytes/s; Tier 0 → Tier 1 centers (e.g. Korea, UK, Russia, USA) at 10-40 Gbps; Tier 1 → Tier 2 centers at 2.5-10 Gbps; Tier 2 → Tier 3 (institute physics caches) at 1-2.5 Gbps; Tier 3 → Tier 4 (end-user PCs) at 1-10 Gbps
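To put the quoted link speeds in perspective, a rough transfer-time calculation is sketched below. The dataset sizes (1 PB, 10 TB, 1 TB) are illustrative assumptions rather than figures from the slide, and the link speeds are taken at the lower end of the quoted ranges.

```c
/* Back-of-the-envelope transfer times for the LCG tier links quoted above.
 * Dataset sizes are illustrative assumptions; link speeds are the lower end
 * of the ranges on the slide. */
#include <stdio.h>

/* Seconds to move `terabytes` of data over a `gbps` gigabit-per-second link,
 * assuming the link is fully and continuously available. */
static double transfer_seconds(double terabytes, double gbps)
{
    double bits = terabytes * 1e12 * 8.0;   /* TB -> bits (decimal units) */
    return bits / (gbps * 1e9);             /* divide by bits per second  */
}

int main(void)
{
    struct { const char *link; double gbps; double dataset_tb; } cases[] = {
        { "Tier 0 -> Tier 1 (10 Gbps)",   10.0, 1000.0 },  /* ~1 PB  */
        { "Tier 1 -> Tier 2 (2.5 Gbps)",   2.5,   10.0 },  /* ~10 TB */
        { "Tier 2 -> Tier 3 (1 Gbps)",     1.0,    1.0 },  /* ~1 TB  */
    };

    for (int i = 0; i < 3; i++) {
        double s = transfer_seconds(cases[i].dataset_tb, cases[i].gbps);
        printf("%-30s %8.1f TB  ~  %6.1f hours\n",
               cases[i].link, cases[i].dataset_tb, s / 3600.0);
    }
    return 0;
}
```

Even under these idealised assumptions, a petabyte at 10 Gbps needs roughly nine days of sustained transfer, which is why the hierarchy relies on dedicated wide-area links and fans data out through the tiers.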

  11. • Develop a service grid infrastructure in Europe • Brings together 70 organisations in 27 countries • Three core areas: build a consistent, robust and secure grid network; continuously improve and maintain the middleware; attract new users from industry as well as science and ensure a high standard of support • Two pilot application domains: the Large Hadron Collider and biomedicine

  12. KOREA • 13 resource providers consisting of 7 supercomputers and 9 high-performance clusters • Heterogeneous computing architectures, including Linux, AIX, HP-UX and IRIX • Based on Globus Toolkit 2.4 and MPICH-G2 1.2.5 • Support application scientists in adapting to the Grid environment • Provide a production CA service based on the APGrid PMA (Policy Management Authority) • Collaborate with other international Grid communities such as PRAGMA, IHPC, GridLab • Application domains: High Energy Physics, biotechnology, nanotechnology, environment technology, space technology
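Because the infrastructure above is built on Globus Toolkit 2.4 with MPICH-G2 (an MPI implementation layered over Globus), the jobs it runs are ordinary MPI programs. Below is a minimal sketch of such a program; the compile and launch details (mpicc, the site's mpirun or globusrun configuration) are generic MPI/MPICH-G2 assumptions rather than specifics from the slide.

```c
/* Minimal MPI program of the kind an MPICH-G2 deployment can run across
 * grid sites.  A sketch only: compile with mpicc and launch through the
 * site's mpirun/globusrun setup (details vary per installation). */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, name_len;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);                   /* start the MPI runtime      */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);     /* this process's rank        */
    MPI_Comm_size(MPI_COMM_WORLD, &size);     /* total processes in the job */
    MPI_Get_processor_name(host, &name_len);  /* node this rank landed on   */

    printf("rank %d of %d running on %s\n", rank, size, host);

    MPI_Finalize();
    return 0;
}
```

Having each rank report where it is running is the simplest way to confirm that a multi-site job actually spanned the intended resources.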

  13. grangenet (AUSTRALIA) • 3-year program • Install, develop and operate a multi-gigabit network • Participate in global Grid efforts: North America, Asia-Pacific, Europe • Support Grid services: distributed computing, collaborative visualisation, cooperative environments, digital libraries • Application domains: computational physics, bioinformatics, astronomy, computational engineering, on-line health, environmental modelling

  14. China National Grid Project • Chinese Gov’t • Investing 12 M USD • Partnering with IBM • Expected to be 6 Teraflops • Eventually increasing to 15 Teraflops • Middleware based on Open Grid Services Architecture • Initial Application Domains • Remote learning • Bioinformatics

  15. PRAGMA and ApGrid • PRAGMA (Pacific Rim Application and Grid Middleware Assembly): NSF funded; establishes sustained collaboration with Asia-Pacific Grid efforts • ApGrid (Asia Pacific Grid): 15 countries, 49 organisations; not funded by any single entity; not application dedicated; a general platform and open community for Grid researchers in the Asia-Pacific

  16. An Inter-Regional Center for High Energy Physics Research and Educational Outreach (CHEPREO) at Florida International University • E/O Center in Miami area • iVDGL Grid Activities • CMS Research • AMPATH network (S. America)

  17. Running on Global Grids • Aim for common interfaces and interoperability • Slide taken from Juergen Knobloch

  18. Conclusion • I have grossly omitted the Global Grid Forum… • The "Grid" as a concept is being globally adopted • Multitude of national and international initiatives • Several sizable grids are continuously active: O(10) production-level Grids world-wide, representing O(10 000) CPUs • Diverse functionality: high-throughput computing, interactive visualisation • Indications that grid technology is maturing • Interoperability and common interfaces are real issues facing global Grid activity • Sociology is at least as important as technology • Standards bodies, like the GGF, are becoming increasingly important
