
The Worldwide LHC Computing Grid

Presentation Transcript


  1. The Worldwide LHC Computing Grid. Visit of the Spanish Royal Academy of Sciences. Dr Ian Bird, LCG Project Leader, 15th April 2009

  2. The LHC Data Challenge
  • Once the accelerator is completed it will run for 10-15 years
  • Experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!)
  • LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors
  • This requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
  Ian Bird, CERN, IT Department
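The slide's two headline numbers are consistent with each other. A quick back-of-the-envelope check, assuming a standard 700 MB CD-ROM:

```python
# Sanity-check the slide's figures: ~15 million GB/year vs "about 20 million CDs".
# The 700 MB CD capacity is an assumption (a standard CD-ROM).
annual_data_gb = 15_000_000      # ~15 million gigabytes per year
cd_capacity_mb = 700             # assumed capacity of one CD

cds_per_year = annual_data_gb * 1000 / cd_capacity_mb
print(f"{cds_per_year / 1e6:.1f} million CDs")   # → 21.4 million CDs
```

About 21 million CDs per year, which the slide rounds to "about 20 million".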

  3. Solution: the Grid
  • The World Wide Web provides seamless access to information stored in many millions of different geographical locations
  • The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe
  • Use the Grid to unite the computing resources of particle physics institutions around the world

  4. How does the Grid work? It makes multiple computer centres look like a single system to the end-user. Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it. Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
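The matchmaking the slide describes can be sketched in a few lines. This is only an illustrative toy, not the actual LCG/EGEE middleware (gLite): the site names, dataset labels and CPU counts below are all hypothetical, and the policy shown (send the job to the least-loaded site that holds the data) is just one simple load-balancing rule.

```python
# Toy sketch of middleware matchmaking: among the sites that hold the
# dataset a job needs, pick the one with the most free CPU capacity.
# All names and numbers are hypothetical, for illustration only.
sites = {
    "CERN": {"datasets": {"run42"},          "free_cpus": 200},
    "PIC":  {"datasets": {"run42", "run43"}, "free_cpus": 850},
    "RAL":  {"datasets": {"run43"},          "free_cpus": 1200},
}

def match_job(dataset, sites):
    """Return the site that holds `dataset` and has the most free CPUs."""
    candidates = [name for name, info in sites.items()
                  if dataset in info["datasets"]]
    if not candidates:
        raise LookupError(f"no site holds {dataset}")
    return max(candidates, key=lambda name: sites[name]["free_cpus"])

print(match_job("run42", sites))   # → PIC (850 free CPUs beats CERN's 200)
```

Real middleware layers security, accounting and monitoring on top of this kind of scheduling decision, but the core idea, hiding many centres behind one matchmaking interface, is the same.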

  5. View of the ATLAS detector (under construction): 150 million sensors deliver data 40 million times per second

  6. Tier 0 at CERN: acquisition, first-pass reconstruction, storage & distribution at 1.25 GB/sec (ions) Ian.Bird@cern.ch
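To put the 1.25 GB/sec heavy-ion figure in perspective, here is what it implies if sustained over a full day (a simplifying assumption; the accelerator does not run continuously):

```python
# What 1.25 GB/s implies if sustained for 24 hours (an assumption for scale,
# not a claim about actual LHC duty cycle).
rate_gb_per_s = 1.25
seconds_per_day = 24 * 3600

daily_tb = rate_gb_per_s * seconds_per_day / 1000
print(f"{daily_tb:.0f} TB/day")   # → 108 TB/day
```

Roughly a hundred terabytes per day flowing out of Tier 0, which is why distribution to the Tier-1 centres is part of its core role.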

  7. LHC Computing Grid project (LCG)
  • More than 140 computing centres
  • 12 large centres for primary data management: CERN (Tier-0) and eleven Tier-1s, including PIC in Spain
  • 38 federations of smaller Tier-2 centres
  • 7 Tier-2s in Spain, supporting ATLAS, CMS and LHCb
  • 35 countries involved

  8. Data Transfer
  • Data distribution from CERN to Tier-1 sites
  • Target rates easily exceeded for extended periods
  • For all experiments and to all Tier-1 sites
  • Also between Tier-1 and Tier-2 sites

  9. Grid activity [charts: Tier 0 + Tier 1 sites; Tier 2 sites]
  • Workload continues to increase, at the scale needed for physics
  • The distribution of work across Tier 0/Tier 1/Tier 2 illustrates the importance of the grid system
  • The Tier-2 contribution is around 50%; >85% of the work is external to CERN

  10. First events

  11. Impact of the LHC Computing Grid in Europe
  • LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
  • EGEE is now a global effort, and the largest Grid infrastructure worldwide
  • Co-funded by the European Commission (cost: ~170 M€ over 6 years, of which ~100 M€ funded by the EU)
  • EGEE is already used for >100 applications, including bio-informatics, education and training, and medical imaging

  12. EGEE-III Main Objectives
  • Expand/optimise the existing EGEE infrastructure; include more resources and user communities
  • Prepare migration from a project-based model to a sustainable federated infrastructure based on National Grid Initiatives
  • Flagship Grid infrastructure project co-funded by the European Commission
  • Duration: 2 years
  • Consortium: ~140 organisations across 33 countries
  • EC co-funding: 32 million €
  The EGEE project - Bob Jones - EGEE'08 - 22 September 2008

  13. EGEE Achievements - Infrastructure
  • EGEE Production Grid Infrastructure: steady growth over the lifetime of the project
  • Improved reliability
  • How can we reduce the effort required to operate this expanding infrastructure?

  14. EGEE Achievements - Applications
  • >270 VOs from several scientific domains: Astronomy & Astrophysics, Civil Protection, Computational Chemistry, Computational Fluid Dynamics, Computer Science/Tools, Condensed Matter Physics, Earth Sciences, Fusion, High Energy Physics, Life Sciences
  • Further applications under evaluation
  • Applications have moved from testing to routine daily usage, at ~80-95% efficiency
  • How do we match the expectations of the growing user communities?
  • Will we have enough computing resources to satisfy their needs?

  15. • Application domains: Archeology, Astronomy, Astrophysics, Civil Protection, Computational Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …
  • EGEE-III partners in Spain: CESGA, CSIC, UNIZAR, UCM, IFAE/PIC, CIEMAT, UPV, RED.ES
  • Scale: >250 sites, 48 countries, >50,000 CPUs, >20 petabytes, >10,000 users, >150 VOs, >150,000 jobs/day

  16. Sustainability
  • Need to prepare for a permanent Grid infrastructure
  • Ensure a high quality of service for all user communities, independent of short project funding cycles
  • Infrastructure managed in collaboration with National Grid Initiatives (NGIs): the European Grid Initiative (EGI)
