
The LHC Computing Grid



Presentation Transcript


  1. The LHC Computing Grid
  Visit of Dr. Robert Zimmer, President, University of Chicago, Chairman of the Fermi Research Alliance
  Frédéric Hemmer, Deputy Head, IT Department
  20 August 2007
  Wolfgang von Rüden, CERN, IT Department

  2. The LHC Data Challenge
  • The accelerator will be completed in 2008 and run for 10-15 years
  • Experiments will produce about 15 Million Gigabytes of data each year (about 20 million CDs!)
  • LHC data analysis requires a computing power equivalent to ~100,000 of today's fastest PC processors
  • Requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
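The arithmetic behind these figures is easy to check. A minimal Python sketch, taking the numbers from the slide and assuming a typical ~700 MB CD (the CD capacity is an assumption, not slide text):

    # Back-of-the-envelope check of the data-challenge figures above.
    ANNUAL_DATA_GB = 15_000_000   # ~15 million gigabytes of data per year (slide)
    CD_CAPACITY_GB = 0.7          # ~700 MB per CD -- an assumed typical value

    print(f"CDs per year: {ANNUAL_DATA_GB / CD_CAPACITY_GB:,.0f}")  # ~21 million

    TOTAL_CPUS = 100_000          # ~100,000 fast PC processors needed (slide)
    CERN_SHARE = 0.20             # CERN can provide only ~20% of the capacity (slide)
    print(f"Processors needed outside CERN: {TOTAL_CPUS * (1 - CERN_SHARE):,.0f}")  # ~80,000

The result, roughly 21 million CDs and 80,000 processors to be found outside CERN, is consistent with the "about 20 million CDs" and "many cooperating computer centres" statements on the slide.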

  3. Solution: the Grid
  • Use the Grid to unite computing resources of particle physics institutes around the world
  • The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations
  • The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe

  4. How does the Grid work?
  • It makes multiple computer centres look like a single system to the end-user
  • Advanced software, called middleware, automatically finds the data the scientist needs, and the computing power to analyse it.
  • Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
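As an illustration of the matchmaking idea described on this slide, here is a toy Python sketch. It is not the gLite Workload Management System or any real middleware API; the Site class, the dataset names and the "most free CPUs wins" policy are all invented for illustration.

    from dataclasses import dataclass
    from typing import List, Optional, Set

    @dataclass
    class Site:
        name: str
        datasets: Set[str]   # dataset replicas held at this site
        free_cpus: int       # currently idle worker slots

    def match_job(dataset: str, sites: List[Site]) -> Optional[Site]:
        """Pick the least-loaded site holding a replica of the requested dataset,
        so the user never has to know where the data or the CPUs actually are."""
        candidates = [s for s in sites if dataset in s.datasets and s.free_cpus > 0]
        return max(candidates, key=lambda s: s.free_cpus, default=None)

    sites = [
        Site("CERN", {"run123", "run124"}, free_cpus=50),
        Site("FNAL", {"run123"},           free_cpus=200),
        Site("RAL",  {"run124"},           free_cpus=120),
    ]
    best = match_job("run123", sites)
    print(best.name if best else "no suitable site")   # -> FNAL

Real middleware also has to handle queue policies, security credentials, accounting and monitoring, which is why the slide lists those responsibilities separately.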

  5. View of the ATLAS detector (under construction)
  150 million sensors deliver data … 40 million times per second

  6. LHC Computing Grid project (LCG)
  • More than 140 computing centres
  • 12 large centres for primary data management: CERN (Tier-0) and eleven Tier-1s
  • 38 federations of smaller Tier-2 centres
  • ~35 countries involved
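A compact way to restate the tier structure above. The counts come from the slide; the one-line role descriptions are the commonly described WLCG division of labour and are an assumption, not slide text.

    # Counts from the slide; role descriptions are assumed, not slide text.
    tiers = [
        ("Tier-0", 1,  "site",        "CERN: record raw data, first-pass processing"),
        ("Tier-1", 11, "sites",       "large centres: primary data management, reprocessing"),
        ("Tier-2", 38, "federations", "smaller centres: simulation and end-user analysis"),
    ]
    for name, count, unit, role in tiers:
        print(f"{name}: {count:>2} {unit:<11} {role}")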

  7. WLCG Collaboration
  • The Collaboration
    • 4 LHC experiments
    • ~140 computing centres
    • 12 large centres (Tier-0, Tier-1)
    • 38 federations of smaller “Tier-2” centres
    • ~35 countries
  • Memorandum of Understanding
    • Agreed in October 2005, now being signed
  • Resources
    • Contributed by the countries participating in the experiments
    • Commitment made each October for the coming year
    • 5-year forward look
  • Relies on EGEE and OSG (and other regional efforts)

  8. OSG & WLCG
  • OSG infrastructure is a core piece of the WLCG
  • OSG delivers accountable resources and cycles for LHC experiment production and analysis
  • OSG federates with other infrastructures
  • Experiments see a seamless global computing facility

  9. Use of OSG by HEP in June-July
  Routinely providing > 150K hours of computing per day to HEP. (Source: OSG)

  10. CMS Data Transfer
  100 TB/day. (Source: OSG)
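For scale, the quoted 100 TB per day corresponds to the following sustained rate (decimal units assumed):

    # Convert the quoted CMS transfer rate into a sustained throughput.
    TB_PER_DAY = 100
    SECONDS_PER_DAY = 24 * 3600

    gb_per_second = TB_PER_DAY * 1000 / SECONDS_PER_DAY
    print(f"{gb_per_second:.2f} GB/s (~{gb_per_second * 8:.0f} Gbit/s) averaged over the day")
    # -> roughly 1.16 GB/s, i.e. about 9 Gbit/s sustained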

  11. SAM tests site availability for CMS
  CMS Site Availability Monitoring (SAM) tests whether the services crucial for CMS jobs are operating at each site. It provides an early indicator of job success rate for each site. Sites are tested each hour; 58 sites are now tested. (Source: CMS)
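A minimal sketch of the idea behind such availability testing: once an hour, run a set of critical probes at every site and count a site as available only if all of them pass. The test names, site names and pass/fail data below are invented placeholders, not the real CMS SAM test suite or its results.

    # One hour's worth of (fabricated) probe results: site -> test name -> pass/fail.
    hourly_results = {
        "T1_US_FNAL":    {"job_submit": True,  "storage_copy": True,  "sw_area": True},
        "T2_DE_DESY":    {"job_submit": True,  "storage_copy": True,  "sw_area": True},
        "T2_IT_Legnaro": {"job_submit": True,  "storage_copy": False, "sw_area": True},
    }

    def site_available(tests: dict) -> bool:
        """A site counts as available only if every critical test passed this hour."""
        return all(tests.values())

    up = sum(site_available(t) for t in hourly_results.values())
    total = len(hourly_results)
    print(f"{up}/{total} sites available this hour ({100 * up / total:.0f}%)")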

  12. Ramp-up to First Physics Run
  Evolution of installed capacity from April 06 to June 07, against the target capacity from MoU pledges for 2007 (due July 07) and 2008 (due April 08). (Chart annotations: 4x, 6x.)

  13. Impact of the LHC Computing Grid in Europe
  • LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
  • EGEE is now a global effort, and the largest Grid infrastructure worldwide
  • Co-funded by the European Commission (cost: ~130 M€ over 4 years, of which ~70 M€ funded by the EU)
  • EGEE already used for >20 applications, including bio-informatics, education and training, and medical imaging

  14. The EGEE project
  • EGEE
    • Started in April 2004, now in 2nd phase with 91 partners in 32 countries
    • 3rd phase (2008-2010) in preparation
  • Objectives
    • Large-scale, production-quality grid infrastructure for e-Science
    • Attracting new resources and users from industry as well as science
    • Maintain and further improve “gLite” Grid middleware

  15. Applications on EGEE
  Summary of applications report: https://edms.cern.ch/document/722132
  • More than 25 applications from an increasing number of domains
    • Astrophysics
    • Computational Chemistry
    • Earth Sciences
    • Financial Simulation
    • Fusion
    • Geophysics
    • High Energy Physics
    • Life Sciences
    • Multimedia
    • Material Sciences
    • …

  16. Example: EGEE Attacks Avian Flu
  • EGEE used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1
  • 2000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks in April, the equivalent of 100 years on a single computer
  • Potential drug compounds now being identified and ranked
  Neuraminidase, one of the two major surface proteins of influenza viruses, facilitates the release of virions from infected cells. Image courtesy Ying-Ta Wu, Academia Sinica.

  17. Example: Geocluster industrial application
  • The first industrial application successfully running on EGEE
  • Developed by the Compagnie Générale de Géophysique (CGG) in France, doing geophysical simulations for oil, gas, mining and environmental industries
  • EGEE technology helps CGG to federate its computing resources around the globe

  18. Sustainability
  • Need to prepare for permanent Grid infrastructure
    • Ensure a high quality of service for all user communities
    • Independent of short project funding cycles
  • Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
  • European Grid Initiative (EGI)

  19. For more information about the Grid:
  www.gridcafe.org
  www.eu-egee.org
  www.cern.ch/lcg
  Thank you for your kind attention!
