Regional Computing Centre for Particle Physics, Institute of Physics AS CR (FZU)

  1. Regional Computing Centre for Particle Physics, Institute of Physics AS CR (FZU): Tier-2 of the LCG (LHC Computing Grid). M. Lokajicek

  2. Outline
  • Mission of the centre
  • General infrastructure (electricity, cooling, network)
  • Capacities and services
  • People and financing
  • Conclusion

  3. Mission of the centre
  • Who are we?
  • Regional Computing Centre for Particle Physics, Institute of Physics of the Academy of Sciences of the Czech Republic, Prague
  • Basic research in particle physics, solid state physics and optics
  • What are we doing?
  • Computing support for big international particle physics, nuclear physics and astroparticle physics experiments using the grid environment
  • D0, ATLAS, ALICE, STAR, AUGER, Belle (CESNET)
  • WLCG Tier-2 centre
  • Solid state physics computing
  • From the computing point of view: High Throughput Computing (HTC), processing of large data samples, chaotic data analysis (by physicists), parallel computing
  • Our users?
  • Collaborating scientists from institutes of the Academy of Sciences of the Czech Republic, Charles University, the Czech Technical University and others
  • Big experiments (grid environment), individual members of the international experiments, local physicists
  • CESNET (Czech research network provider)
  • NGI (National Grid Initiative) of EGI (European Grid Initiative)
  • Provides dedicated point-to-point network lines

  4. Computing Centre Infrastructure
  • Network connections
  • 1 Gb/s to the European GEANT2 network
  • 10 Gb/s to CESNET
  • CESNET optical end-to-end (E2E) lines to FNAL, Taipei, GridKa (FZK Karlsruhe), BNL
  • and to FZÚ, UK, ČVUT, ÚJF Řež
  • Total electric power
  • For computers: 200 kVA; UPS and engine generator: 380 kVA
  • Air cooling: 2x55 kW
  • Water cooling added in 2009: 2 STULZ units, 2x88 kVA; total cooling 290 kVA (N+1) (a rough headroom check is sketched below)
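A minimal sketch in Python of how the quoted power and cooling figures compare. It assumes an average power factor of 0.9 to relate kVA and kW; the actual value for the centre is not stated in the presentation, so treat the results as rough orders of magnitude.

# Back-of-the-envelope check of the power and cooling figures quoted above.
# Assumption: power factor of 0.9 to convert kVA to kW (not given in the slides).

POWER_FACTOR = 0.9            # assumed, not from the slides

it_load_kva = 200             # electric power budget for the computers
ups_generator_kva = 380       # UPS + engine generator rating
air_cooling_kw = 2 * 55       # two air-cooling units
water_cooling_kva = 2 * 88    # two STULZ water-cooling units (2009)

it_load_kw = it_load_kva * POWER_FACTOR
cooling_kw = air_cooling_kw + water_cooling_kva * POWER_FACTOR

print(f"IT load:                ~{it_load_kw:.0f} kW")
print(f"Installed cooling:      ~{cooling_kw:.0f} kW")
print(f"Cooling margin:         ~{cooling_kw - it_load_kw:.0f} kW")
print(f"UPS/generator headroom: ~{ups_generator_kva - it_load_kva} kVA")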

  5. Contribution to Tier-2s in ATLAS + ALICE
  • http://www3.egee.cesga.es/gridsite/accounting/CESGA/country_view.html
  • Long-term decline of our relative share since 2006, caused by the lack of regular financing
  • Regular financing from grants since 2008 (in the meantime, all other Tier-2s increased their capacity substantially)

  6. Raw capacities at FZU

  7. Center for Particle Physics in Prague
  • Delivering substantial capacities to D0 (second largest contribution to the experiment from external laboratories), ATLAS (2%), ALICE (7%, very good numbers) and other experiments
  • AUGER VO created in Prague
  • Support for Tier-3 centres at Prague institutes and universities
  • Today's capacity: 23 kHEPSPEC2006, 2,800 cores, 1.7 PB of disk space
  • LCG capacities: 250,000 cores, 100 PB of disk, >140 centres worldwide -> Prague belongs among the bigger centres (see the share estimate below)
  • [Chart: running jobs in 2011 for D0, ATLAS, ALICE and AUGER; peak around 2,800 jobs]
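A small sketch, using only the figures quoted on this slide, to illustrate why Prague counts among the bigger of the >140 WLCG centres. The per-core HEPSPEC06 value is a derived estimate, not a number stated in the presentation.

# Prague's share of the WLCG totals, computed from the slide's figures.
prague_cores = 2_800
prague_disk_pb = 1.7
prague_hepspec06 = 23_000

wlcg_cores = 250_000
wlcg_disk_pb = 100

print(f"Share of WLCG cores: {prague_cores / wlcg_cores:.1%}")          # ~1.1%
print(f"Share of WLCG disk:  {prague_disk_pb / wlcg_disk_pb:.1%}")      # ~1.7%
print(f"HEPSPEC06 per core:  ~{prague_hepspec06 / prague_cores:.1f}")   # ~8.2 (derived)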

  8. D0 Monte Carlo production and physics analysis
  • D0 MC 2010
  • MC simulations in external laboratories
  • FZU produced 486M events, the 2nd largest contribution after IN2P3; 1,300 concurrent jobs (2M jobs/year), 55 TB sent to FNAL (rough per-job averages are sketched below)
  • Physics analysis
  • PhD theses done in Prague
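Illustrative averages derived only from the D0 Monte Carlo numbers above; real job lengths and event sizes vary considerably, so this is a sketch of scale, not production bookkeeping.

# Average events per job and output per event, from the slide's figures.
events_2010 = 486e6      # MC events produced at FZU in 2010
jobs_per_year = 2e6      # jobs per year
data_to_fnal_tb = 55     # output shipped to FNAL, in TB

events_per_job = events_2010 / jobs_per_year
kb_per_event = data_to_fnal_tb * 1e12 / events_2010 / 1e3

print(f"Events per job (average):   ~{events_per_job:.0f}")    # ~240
print(f"Output per event (average): ~{kb_per_event:.0f} kB")   # ~110 kB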

  9. Observation of a Centrality-Dependent Dijet Asymmetry in Lead-Lead Collisions at √sNN = 2.76 TeV with the ATLAS Detector at the LHC
  • Colleagues from Charles University (2 of the 4 main authors) performed the computing at our farm.

  10. Operation, Financing
  • Operation
  • The centre is operated by SAVT (the computing department of FZU), together with other computing services for FZU
  • Our services are run by 3-4 FTE
  • A contact person from each experiment stays in close contact with the operations staff
  • Financing
  • FZU: construction, power infrastructure (UPS, generator) and cooling, 20 MCZK in 2003/4; operation and electricity
  • HEP research grants: computing capacities
  • Regular financing since 2008 (MoU with LCG), 6 MCZK per year

  11. Conclusion
  • Long-term activity, started in 2001 with the European GRID project EDG
  • Most of the personnel are IT professionals
  • Successful operation for the experiments
  • Institute support for farm construction and operation
  • Grants for computing capacities
  • Visible grid and non-grid computing services for HEP experiments and Czech collaborating institutes and physicists
  • Substantial support to our physicists and students in the competition for physics analyses

  12. 10:00 Introductions/Welcome: Prague Regional Computing Centre for Particle Physics
  • 10:15-10:45 Jan Švec: FZU Computing Centre
  • 10:45-12:30 Roger Goff: HPC Technology
  • Future x86-64 Processor Technology
  • Green HPC
  • Co-processor Technology
  • Storage Technology (SSDs, LHC filesystem solutions)
  • 12:30-13:00 Visit of the Centre
