
The experience with LCG GRID in Russia



  1. The experience with LCG GRID in Russia E.Lyublev, A.Selivanov, B.Zagreev ITEP, Moscow November 3, 2005

  2. ITEP history • ITEP was founded on December 1, 1945 • The heavy-water reactor started operation in 1949 • In 1961 the 7-GeV proton synchrotron started operating; it was the first Russian proton accelerator using the strong-focusing principle • Today ITEP is a Russian scientific center devoted to nuclear physics and the physics of elementary particles • The Institute occupies the grounds of the old eighteenth-century "Cheremushki" estate

  3. ITEP: Alikhanov Institute for Theoretical and Experimental Physics, a Russian Federation State Scientific Center

  4. ITEP in winter

  5. Research program Particle and nuclear physics: • Theoretical studies • Experimental research: @ITEP accelerator; @CERN, FNAL, DESY, KEK and other international centers • Double-beta (2β) decay searches (Ge, Mo, Xe …)

  6. Research program • Low energy physics and chemistry • Accelerator techniques • Nuclear power facilities • Medical physics Details: www.itep.ru

  7. International collaboration • DESY (Hamburg) – ARGUS, H1, HERA-B • CERN (Geneva) – AMS, CHORUS, L3, ATLAS, ALICE, CMS, LHCb • FNAL (Batavia) – D0, E781 (SELEX) • GSI (Darmstadt) – CBM

  8. Russian participation in EGEE/LCG: RDIG, the Russian Data Intensive Grid

  9. RDIG member institutes: PNPI, JINR, KIAM, ITEP, SINP, RRC KI, IHEP, IMPB

  10. ITEP EGEE/LCG production cluster

  11. ITEP EGEE/LCG production cluster

  12. ITEP EGEE/LCG hardware • UI – user interface • CE – computing element • SE – storage element • WNs – worker nodes for the batch system • Mon – R-GMA server • VO Box – Virtual Organization server • RDIG user support server • LFC – LCG File Catalog
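To illustrate how a user drives these components, here is a minimal sketch of submitting a test job from the UI through the LCG-2 workload management tools. The JDL and the edg-job-submit call follow the standard LCG-2 user tools; the helper function and file names are ours, and the exact options may differ between middleware releases.

import subprocess
import tempfile

# A minimal JDL (Job Description Language) file for the LCG-2
# workload management system: run /bin/hostname on a worker node
# and return stdout/stderr in the output sandbox.
JDL = '''
Executable    = "/bin/hostname";
StdOutput     = "hostname.out";
StdError      = "hostname.err";
OutputSandbox = {"hostname.out", "hostname.err"};
'''

def submit_test_job():
    # Assumes the LCG-2 UI tools are installed and a valid grid
    # proxy exists (grid-proxy-init / voms-proxy-init).
    with tempfile.NamedTemporaryFile("w", suffix=".jdl",
                                     delete=False) as f:
        f.write(JDL)
        jdl_path = f.name
    # edg-job-submit prints the job identifier (an https:// URL)
    # later used with edg-job-status / edg-job-get-output.
    result = subprocess.run(["edg-job-submit", jdl_path],
                            capture_output=True, text=True, check=True)
    print(result.stdout)

if __name__ == "__main__":
    submit_test_job()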

  13. ITEP LCG parameters • OS – SLC 3.05 • MW – LCG-2.6.0-9 • Batch system – PBS with Maui • WNs – P4 (HT) 2.4 GHz, 1 GB RAM, 80 GB disk • SE: Classic SE – 1 TB; dCache/SRM – 4 TB

  14. Network • ITEP network backbone – 1 Gb/s Ethernet • ITEP LAN – 100 Mb/s Ethernet, plus wireless • WAN – 1 Gb/s channel (RAS), 100 Mb/s channel (MSU)
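To put these figures in perspective, a back-of-the-envelope calculation (ours, not from the slides) shows why WAN bandwidth dominates data-challenge planning: moving the ~1.7 TB accumulated on the SE during DC04 takes hours even at full nominal link rate. A minimal sketch, with an assumed 50% usable-bandwidth efficiency:

def transfer_time_hours(data_tb, link_gbps, efficiency=0.5):
    # Hours to move data_tb terabytes over a link_gbps Gb/s link.
    # `efficiency` (an assumed, illustrative value) accounts for
    # protocol overhead and link sharing.
    bits = data_tb * 1e12 * 8             # TB -> bits
    rate = link_gbps * 1e9 * efficiency   # usable bits/s
    return bits / rate / 3600

# ~1.7 TB of DC04 data (see the DC04 summary slide):
print("1 Gb/s RAS link:   %.1f h" % transfer_time_hours(1.7, 1.0))   # ~7.6 h
print("100 Mb/s MSU link: %.1f h" % transfer_time_hours(1.7, 0.1))   # ~75.6 h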

  15. Application SW • ALICE – AliEn 2_4 (VO Box) + AliRoot, ROOT, xrootd … • ATLAS – VO-atlas-release-10.0.4 • CMS – OSCAR_3_6_5, ORCA_8_7_1, CMKIN_4_4_0_dar • LHCb – Gaudi-v15r5, DaVinci-v12r11

  16. Monitoring & statistics • GOC • GridICE • MonALISA • Farm statistics • Network statistics
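Alongside the grid-level monitors (GOC, GridICE, MonALISA), local farm statistics can be pulled straight from the PBS batch system named on the parameters slide. A minimal sketch, assuming the standard qstat client is on the PATH; the column layout parsed here matches common OpenPBS/Torque output and may vary between versions:

import subprocess
from collections import Counter

def pbs_job_states():
    # Count PBS jobs by state (R = running, Q = queued, ...).
    # Parses plain `qstat` output; assumes the job state is the
    # next-to-last column, as in common OpenPBS/Torque builds.
    out = subprocess.run(["qstat"], capture_output=True,
                         text=True, check=True).stdout
    states = Counter()
    for line in out.splitlines():
        cols = line.split()
        # Skip the two header lines (title row and dashes).
        if len(cols) >= 6 and not line.startswith(("Job", "---")):
            states[cols[-2]] += 1
    return states

if __name__ == "__main__":
    for state, n in sorted(pbs_job_states().items()):
        print("%s: %d" % (state, n))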

  17. GridICE

  18. RDIG Monitoring

  19. RDIG User Support

  20. ALICE DC04 statistics

  21. DC04 Summary • About 7000 jobs were successfully completed at the Russian AliEn sites in 2004, which is ~4% of the total ALICE statistics; job efficiency was about 75% • Quite visible participation in the ALICE and LHCb Data Challenges: the ITEP share was ~70%, with ~1.7 TB on the SE
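A quick consistency check on these numbers (our arithmetic, not from the slides): if ~7000 jobs are ~4% of the total, ALICE ran on the order of 175,000 jobs overall, and ITEP's ~70% share corresponds to roughly 4,900 of the Russian jobs.

russian_jobs = 7000
russian_share = 0.04   # Russian fraction of total ALICE DC04 jobs
itep_share = 0.70      # ITEP fraction of the Russian contribution

total_alice_jobs = russian_jobs / russian_share   # ~175,000
itep_jobs = russian_jobs * itep_share             # ~4,900

print("Total ALICE DC04 jobs: ~%.0f" % total_alice_jobs)
print("ITEP jobs:             ~%.0f" % itep_jobs)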

  22. DC05 to be continued…

  23. Timeline of PDC05/SC3 (August–December 2005) • Phase 1 – event production • Phase 2 – ALICE data 'push': reserved/shared bandwidth, test of FTS • Phase 3 – prototype data analysis • Job submission through the LCG interface • SC3 – start of the service phase

  24. Participating in ALICE SC3 with all experiment-specific SW

  25. AliEn (ALICE Environment) • The AliEn framework has been developed as the ALICE user entry point into the Grid world, shielding users from its underlying complexity and heterogeneity. Through interfaces, it can transparently use the resources of different Grids (the LCG and INFN Grids). In the future, this cross-Grid functionality will be extended to cover other Grid flavours. • The system is built around Open Source components and uses a Web Services model and standard network protocols. Less than 5% is native AliEn code (Perl). • No other Grid flavour provides a complete solution for the ALICE computing model; these Grids all provide different user interfaces and a diverse spectrum of functionality. • Therefore some of the AliEn services will continue to be used as ALICE's single point of entry to the computing resources of other Grids and as a complement to their functionality. The foreign Grids will be accessed via interfaces.

  26. AliEn services structure (diagram) • Central services: catalogue, task queue, job optimization, etc. • Sites run AliEn CE/SE services; job submission goes from the AliEn CE through an LCG UI to the LCG RB, which dispatches to the LCG CEs • File registration connects the AliEn SEs to the LCG SE/SRM endpoints
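To make the flow concrete, here is a toy model of the submission chain; the class names and behaviour are purely illustrative (ours, not the AliEn API): jobs sit in a central task queue, and site CEs pull work and forward it to the local Grid through the LCG UI/RB.

from collections import deque

class TaskQueue:
    # Toy stand-in for the AliEn central task queue.
    def __init__(self):
        self.jobs = deque()

    def push(self, job):
        self.jobs.append(job)

    def pull(self):
        return self.jobs.popleft() if self.jobs else None

class SiteCE:
    # Toy AliEn CE: pulls work from the central queue and forwards
    # it to the local Grid (in reality via the LCG UI and RB).
    def __init__(self, name):
        self.name = name

    def request_job(self, queue):
        job = queue.pull()
        if job is None:
            print("%s: queue empty, nothing to run" % self.name)
        else:
            print("%s: submitting '%s' via LCG RB" % (self.name, job))

queue = TaskQueue()
queue.push("aliroot-sim-run001")
queue.push("aliroot-sim-run002")
for ce in (SiteCE("ITEP"), SiteCE("JINR"), SiteCE("SINP")):
    ce.request_job(queue)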

  27. ALICE interface to LCG • Through a VO-Box provided at the site • Full LCG UI mapping • AliEn services (Cluster Monitor, CE, SE, MonALISA, PackMan, xrootd) • VO-Box requirements published: https://uimon.cern.ch/twiki/pub/LCG/ALICEResourcesAndPlans/alice_vobox_requirements.doc

  28. ITEP LCG site as a Tier2 in SC3 (ALICE) • LCG 2.6 • FTS client • SE: dCache with SRM • LFC • xrootd protocol • AliEn 2_4 • Connectivity with Tier1 centers is an issue!
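As a concrete example of exercising that Tier1 connectivity, a minimal sketch of replicating an LFC-catalogued file to a Tier1 SE with the lcg-rep data-management client shipped with LCG 2.x. The LFN and destination SE hostname are placeholders, and the exact options should be checked against the installed lcg_utils version:

import subprocess

def replicate_to_tier1(lfn, dest_se, vo="alice"):
    # Replicate a catalogued file to a destination SE via lcg-rep.
    # Assumes a valid VOMS proxy and LFC_HOST set in the environment.
    subprocess.run(
        ["lcg-rep", "-v", "--vo", vo, "-d", dest_se, "lfn:" + lfn],
        check=True,
    )

# Hypothetical file and Tier1 SE, for illustration only:
replicate_to_tier1("/grid/alice/itep/sc3/test001.root",
                   "castorgrid.cern.ch")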

  29. Non-LHC GRID activity • The Russian VO PHOTON was organized for SELEX colleagues in 2005 • Regional centre for AMS (VO in preparation) • Collaboration with the CBM project (GSI, Darmstadt) • The ITEP theory department is very interested

  30. Summary & plans • Ready for PDC05/SC3 • Further support of the LHC experiments' Data Challenges is becoming a trivial task that runs automatically • Significantly increase the power of the ITEP farm in 2006; the current installation occupies only ~5% of the infrastructure • Concentrate on distributed analysis; connectivity with Tier1 centers is an issue!
