The LHC Computing Grid


Presentation Transcript


  1. The LHC Computing Grid. Visit of Mr Yasushi Furukawa, Governor of Saga Prefecture, Japan. Frédéric Hemmer, Deputy Head, IT Department, 17 June 2008

  2. The LHC Data Challenge
  • The accelerator will be completed in 2008 and run for 10-15 years
  • Experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!)
  • LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
  • This requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
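
The sizing figures above follow from simple arithmetic. In the quick check below, the 700 MB CD capacity is an assumed standard value; everything else is taken directly from the slide.

    # Back-of-the-envelope check of the data-volume figures quoted on this slide.
    annual_data_gb = 15e6        # ~15 million gigabytes of data per year (slide figure)
    cd_capacity_gb = 0.7         # ~700 MB per CD (assumed standard CD-ROM size)
    cds_per_year = annual_data_gb / cd_capacity_gb
    print(f"CDs per year: {cds_per_year:,.0f}")   # ~21 million, i.e. "about 20 million CDs"

    cern_share = 0.20            # CERN can provide only ~20% of the required capacity
    required_cpus = 100_000      # ~100,000 of today's (2008) fastest PC processors
    print(f"CPUs needed outside CERN: {required_cpus * (1 - cern_share):,.0f}")   # 80,000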

  3. Solution: the Grid
  • The World Wide Web provides seamless access to information stored in many millions of different geographical locations
  • The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe
  • The Grid is used to unite the computing resources of particle physics institutes around the world

  4. How does the Grid work?
  • It makes multiple computer centres look like a single system to the end user
  • Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it
  • Middleware balances the load on different resources; it also handles security, accounting, monitoring and much more
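
To make the matchmaking idea concrete, here is a deliberately simplified sketch of how a broker might pick a site for a job using data location and current load. The site names, numbers and scoring rule are invented for illustration; this is not the actual gLite workload-management logic.

    # Toy illustration of grid matchmaking: send a job to a site that holds the
    # needed dataset and currently has the lightest fractional load.
    sites = [
        {"name": "CERN",  "datasets": {"run1234"}, "running_jobs": 950, "slots": 1000},
        {"name": "IN2P3", "datasets": {"run1234"}, "running_jobs": 400, "slots": 800},
        {"name": "FNAL",  "datasets": {"run9999"}, "running_jobs": 100, "slots": 600},
    ]

    def choose_site(dataset, sites):
        # Keep only sites that host the requested dataset, then pick the least loaded.
        candidates = [s for s in sites if dataset in s["datasets"]]
        if not candidates:
            return None
        return min(candidates, key=lambda s: s["running_jobs"] / s["slots"])

    best = choose_site("run1234", sites)
    print(best["name"])   # -> "IN2P3" (holds the data and has the lower fractional load)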

  5. View of the ATLAS detector (under construction)
  • 150 million sensors deliver data … 40 million times per second
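
Multiplying the two numbers on this slide shows why the raw detector output cannot simply be written to storage; the arithmetic below uses only the slide's own figures.

    # Raw readout rate implied by the two figures on this slide.
    sensors = 150e6               # 150 million sensors
    readings_per_second = 40e6    # each delivering data 40 million times per second
    print(f"{sensors * readings_per_second:.1e} sensor readings per second")   # 6.0e+15
    # Only a tiny, filtered fraction of this raw stream ends up in the
    # ~15 million gigabytes per year of stored data quoted on slide 2.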

  6.–8. (Image-only slides; no transcribed text)

  9. LHC Computing Grid project (LCG)
  • More than 140 computing centres
  • 12 large centres for primary data management: CERN (Tier-0) and eleven Tier-1s
  • 38 federations of smaller Tier-2 centres
  • 35 countries involved
  • Tier-2 in Japan: ICEPP, University of Tokyo

  10. LCG Service Hierarchy
  • Tier-0: the accelerator centre
   – Data acquisition & initial processing
   – Long-term data curation
   – Distribution of data to the Tier-1 centres
  • Tier-1: "online" to the data acquisition process, so high availability is required
   – Managed mass storage – grid-enabled data service
   – Data-heavy analysis
   – National and regional support
   – The eleven Tier-1s: Canada – TRIUMF (Vancouver); France – IN2P3 (Lyon); Germany – Forschungszentrum Karlsruhe; Italy – CNAF (Bologna); Netherlands – NIKHEF/SARA (Amsterdam); Nordic countries – distributed Tier-1; Spain – PIC (Barcelona); Taiwan – Academia Sinica (Taipei); UK – CLRC (Oxford); US – FermiLab (Illinois) and Brookhaven (NY)
  • Tier-2: ~140 centres in ~35 countries
   – Simulation
   – End-user analysis, batch and interactive
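
The same hierarchy can be summarized as a small lookup table. The sketch below simply restates the roles listed on this slide in compact form; it is an illustrative summary, not any real LCG configuration format.

    # Compact restatement of the division of labour between tiers described above.
    TIER_ROLES = {
        "Tier-0 (CERN)": [
            "data acquisition & initial processing",
            "long-term data curation",
            "distribution of data to the Tier-1 centres",
        ],
        "Tier-1 (11 large centres)": [
            "managed mass storage (grid-enabled data service)",
            "data-heavy analysis",
            "national and regional support",
            "high availability, 'online' to the data acquisition process",
        ],
        "Tier-2 (~140 centres in ~35 countries)": [
            "simulation",
            "end-user analysis, batch and interactive",
        ],
    }

    for tier, roles in TIER_ROLES.items():
        print(f"{tier}: {'; '.join(roles)}")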

  11. WLCG Grid Activity in 2007
  • WLCG ran ~44 million jobs in 2007, and the workload has continued to increase
  • The distribution of work across Tier-0 / Tier-1 / Tier-2 illustrates the importance of the grid system
  • The Tier-2 contribution is around 50%; more than 85% of the work is external to CERN
  • Data distribution from CERN to the Tier-1 sites: the latest tests in February show that the data rates required for LHC start-up have been reached and can be sustained over long periods

  12. Evolution of job rate
  • The workload is now at ~230k jobs/day and is likely to reach a total of ~100 million jobs in 2008
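
A quick extrapolation shows what the daily rate implies for the year; the only assumption added below is a constant rate, for comparison with the quoted total.

    # Extrapolating the job-rate figures quoted above.
    jobs_per_day = 230_000
    flat_rate_total = jobs_per_day * 365
    print(f"{flat_rate_total / 1e6:.0f} M jobs at a constant 230k/day")   # ~84 M
    # Reaching the expected ~100 M jobs in 2008 therefore assumes the daily rate
    # keeps growing through the year, as the preceding slides indicate it has.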

  13. Tier-0 / Tier-1 Site Reliability
  • Targets: sites 91%, and 93% from December; 8 best sites 93%, and 95% from December
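
To give the percentage targets a concrete meaning, they can be converted into the maximum downtime a site may accumulate; the 30-day month in the sketch below is an assumption for illustration.

    # Converting the reliability targets into allowed downtime (30-day month assumed).
    hours_per_month = 30 * 24
    for target in (0.91, 0.93, 0.95):
        allowed_downtime = (1 - target) * hours_per_month
        print(f"{target:.0%} reliability -> up to {allowed_downtime:.0f} h of downtime per month")
    # 91% -> ~65 h, 93% -> ~50 h, 95% -> ~36 h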

  14. Tier-2 Reliabilities
  • Reliabilities have been published regularly since October
  • In February, 47 sites had > 90% reliability
  • For the Tier-2 sites reporting: (summary table not transcribed)
  • Of the Tier-2 sites not reporting, 12 are in the top 20 for CPU delivered in January 2008

  15. Impact of the LHC Computing Grid in Europe
  • LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
  • EGEE is now a global effort, and the largest Grid infrastructure worldwide
  • Co-funded by the European Commission (cost: ~130 M€ over 4 years, of which ~70 M€ funded by the EU)
  • EGEE is already used for >20 applications, including bio-informatics, education and training, and medical imaging

  16. The EGEE project
  • EGEE started in April 2004 and is now in its third phase (2008-2010)
  • It brings together more than 240 institutions in 45 countries world-wide
  • EGEE-III beneficiary in Japan: Inter-University Research Institute Corporation, High Energy Accelerator Research Organization (KEK)
  • Objectives:
   – A large-scale, production-quality grid infrastructure for e-Science
   – Attracting new resources and users from industry as well as science
   – Maintaining and further improving the "gLite" Grid middleware

  17. Applications on EGEE
  • More than 25 applications from an increasing number of domains, including astrophysics, computational chemistry, earth sciences, financial simulation, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …
  • Summary of applications report: https://edms.cern.ch/document/722132

  18. Increasing workloads
  • We still expect a factor of 5 increase for the LHC experiments over this year
  • Non-LHC usage is now 32%

  19. Example: EGEE Attacks Avian Flu
  • EGEE was used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1
  • 2000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks in April – the equivalent of 100 years on a single computer
  • Potential drug compounds are now being identified and ranked
  • Image: neuraminidase, one of the two major surface proteins of influenza viruses, which facilitates the release of virions from infected cells (image courtesy of Ying-Ta Wu, Academia Sinica)
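
This screening campaign is a classic embarrassingly parallel workload: each candidate compound can be docked against the target protein independently, which is why it distributes so well over thousands of machines. The sketch below only illustrates that splitting; the chunk size and compound names are placeholders, not the docking code actually used.

    # Hypothetical sketch of splitting a virtual-screening campaign into grid jobs.
    def make_jobs(compounds, chunk_size=150):
        """Split the compound list into independent work units (grid jobs)."""
        return [compounds[i:i + chunk_size] for i in range(0, len(compounds), chunk_size)]

    compounds = [f"compound_{i:06d}" for i in range(300_000)]   # 300,000 candidates
    jobs = make_jobs(compounds)
    print(f"{len(jobs)} independent jobs of up to 150 compounds each")   # 2000 jobs

    # On the grid, each job would run a docking/scoring step on its chunk and return
    # ranked scores; the per-compound scoring itself is outside the scope of this sketch.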

  20. Example: Geocluster industrial application
  • The first industrial application successfully running on EGEE
  • Developed by the Compagnie Générale de Géophysique (CGG) in France, which does geophysical simulations for the oil, gas, mining and environmental industries
  • EGEE technology helps CGG to federate its computing resources around the globe

  21. Sustainability
  • Need to prepare for a permanent Grid infrastructure
  • Ensure a high quality of service for all user communities
  • Independent of short project funding cycles
  • Infrastructure managed in collaboration with National Grid Initiatives (NGIs): the European Grid Initiative (EGI)

  22. For more information about the Grid:
  • www.cern.ch/lcg
  • www.gridcafe.org
  • www.eu-egi.org/
  • www.eu-egee.org
  Thank you for your kind attention!
