
Grid Applications for High Energy Physics and Interoperability



Presentation Transcript


  1. Centre de Calcul de l’IN2P3 et du DAPNIA. Grid Applications for High Energy Physics and Interoperability. Dominique Boutigny, CC-IN2P3, June 24, 2006.

  2. [Diagram: the W-LCG tiered grid architecture. Tier-0 at CERN; Tier-1 centres including CC-IN2P3 (France), BNL and FermiLab (USA), RAL (UK), CNAF (Italy), GridKa (Germany), SARA (Netherlands), PIC (Spain), TRIUMF (Canada) and ASCC (Taipei); Tier-2 centres such as GRIF, LAPP, CPPM, IN2P3-LPC, IN2P3-Subatech, IFIC, IFCA, CIEMAT, USC, UB, MSU, IC, Cambridge, Rome, Krakow and CSCS; Tier-3 laboratory and university clusters; desktops and laptops.] The LHC experiments will produce about 15 PByte of data per year. A computing Grid architecture at the world level is mandatory to process them: the W-LCG Grid infrastructure (a rough rate estimate follows below).
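
A back-of-the-envelope check (not from the slides; it assumes the 15 PByte were taken uniformly over a full year) shows why no single centre can absorb this stream:

    # Rough estimate: sustained rate implied by 15 PByte/year,
    # assuming data taking were spread uniformly over the year.
    SECONDS_PER_YEAR = 365 * 24 * 3600          # ~3.15e7 s
    data_per_year_bytes = 15e15                 # 15 PByte = 15 * 10^15 bytes
    avg_rate = data_per_year_bytes / SECONDS_PER_YEAR
    print(f"sustained average rate: {avg_rate / 1e6:.0f} MB/s")  # ~476 MB/s

Roughly half a gigabyte per second, around the clock, before any reprocessing or replication, which is why the load is spread over the Tier-1 and Tier-2 centres.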

  3. Framework
  • Grid activities at IN2P3 are mainly related to LHC computing
  • Deeply involved in the EGEE project:
    • Grid operation: ~10 engineers involved at CC-IN2P3
    • Network
  • The main goal is to set up a Tier-1 node at CC-IN2P3 for the worldwide W-LCG grid, for physics-oriented production
  • LHC will start in 2007; the Grid infrastructure should be up and running by then
  • Experiencing the Grid through Service and Data Challenges
  • Ramping up, step by step, the data throughput and the number of jobs to the LHC nominal values

  4. [Network monitoring plots: output from CERN, 1.6 GB/s; input to CC-IN2P3, 250 MB/s.] Optical network connection to CERN: 10 Gbps.
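
For scale, a quick unit conversion (illustrative; it assumes the quoted 250 MB/s input travels over that 10 Gbps link):

    # Unit check: what fraction of the 10 Gbps CERN link does 250 MB/s use?
    link_capacity = 10e9 / 8                    # 10 Gbps = 1.25 GB/s
    input_rate = 250e6                          # 250 MB/s into CC-IN2P3
    print(f"link capacity: {link_capacity / 1e9:.2f} GB/s")
    print(f"utilisation:   {input_rate / link_capacity:.0%}")   # ~20%

So the monitored input uses about a fifth of the optical link, leaving headroom for the ramp-up to nominal LHC rates.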

  5. W-LCG = EGEE (gLite) + OSG (+ Nordic)
  • Need to interoperate both Grids
  • At the job level:
    • Some level of interoperability has been reached between LCG 2 and OSG
    • Still some work to do, especially for gLite / OSG interoperation
    • Not clear how interoperation works at the level of the data catalogs (a toy sketch follows this slide)
  • At the operation level:
    • Work is just starting
    • Crucial in the context of LHC computing
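
The catalog question can be made concrete with a toy model (purely illustrative Python; these names and structures do not correspond to the real EGEE or OSG catalog APIs). Each grid maps a logical file name to physical replicas, and catalog-level interoperation means being able to resolve a logical name across both:

    # Toy illustration only: the real EGEE (LFC) and OSG catalogs
    # have their own APIs; the names and paths here are hypothetical.
    egee_catalog = {"lfn:/grid/atlas/run123/evts.root":
                    ["srm://ccsrm.in2p3.fr/atlas/run123/evts.root"]}
    osg_catalog  = {"lfn:/grid/atlas/run123/evts.root":
                    ["srm://dcache.fnal.gov/atlas/run123/evts.root"]}

    def resolve(lfn):
        """Cross-grid lookup: merge the replicas known to either catalog."""
        return egee_catalog.get(lfn, []) + osg_catalog.get(lfn, [])

    print(resolve("lfn:/grid/atlas/run123/evts.root"))  # both replicas visible

Without such a cross-grid view, a job scheduled on one grid cannot see replicas registered on the other.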

  6. IT / CC-IN2P3 relationships
  • ANR: a 3-year project: CC-IN2P3 – LIP (RESO) – RENATER + FNAL (Chicago)
  • Two-fold:
    • OSG / EGEE interoperability (CC-IN2P3)
    • High-bandwidth network transfers (LIP)
  • Understand data transfer patterns on a long-distance network, with data from real applications
  • Optimize data transfers (see the bandwidth-delay estimate below)
  • Will get a 2×1 Gbps dedicated link between Chicago and CC-IN2P3 (RENATER)
  • Hope to have it upgraded to 10 Gbps before the end of the project
  • The project started 3-4 months ago; quite some infrastructure had to be set up before actual work could begin
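
One reason long-distance transfers need this kind of study is the bandwidth-delay product: a standard TCP connection only fills the pipe if its window covers all the data in flight. A quick estimate (the ~100 ms Chicago-Lyon round-trip time is an assumed order of magnitude, not a measured value):

    # Bandwidth-delay product for one of the 1 Gbps Chicago <-> Lyon links.
    bandwidth_bps = 1e9        # 1 Gbps dedicated link
    rtt = 0.100                # assumed round-trip time, in seconds
    bdp = bandwidth_bps * rtt / 8
    print(f"TCP window needed to fill the link: ~{bdp / 1e6:.1f} MB")  # ~12.5 MB

Default TCP buffer sizes are far below this, which is why transfer patterns on real application data are worth studying in their own right.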

  7. CC-IN2P3 and the W-LCG grid
  • Large infrastructure with operational support
  • Provides real applications involving a huge amount of data
  • Computer science:
    • Develops Grid software from a more fundamental point of view
    • Able to handle workflows, bottlenecks, data access, networks etc. without the pressure of HEP data to analyze
  • I am convinced that both worlds could benefit from each other, and that the next generation of operational HEP Grids will come from the IT world.

  8. France / Japan relationship
  • 2 computing projects within the Associated International Laboratory (AIL):
    • 1 focused on LHC computing
    • 1 focused on interoperability between NAREGI and EGEE
  • Large overlap with NEGST
  • Close relationship between Tokyo University and CC-IN2P3 related to the ATLAS experiment
  • Wish to connect the Tokyo Tier-2 (ICEPP) to the CC-IN2P3 Tier-1
    • Data exchange between both sites
  • Strong interest in developing relationships in view of the future International Linear Collider project, where France and Japan have a common interest

  9. SRB
  • The Storage Resource Broker and its successor iRODS (both developed at SDSC) are also subjects of common interest
  • SRB is a distributed file catalog with a sophisticated metadata management system
  • Data distribution / replication is very efficient (heavily used in the BaBar experiment between Stanford and Lyon)
  • SRB is not part of EGEE / OSG
  • Japanese colleagues at KEK are working on SRB / EGEE interoperation
  • CC-IN2P3 is very interested in joining this effort
  • Recent interoperability test between several SRB server federations located in Japan, New Zealand, the USA, France, Italy, the UK and Australia
  • Performance was very good, and the worldwide SRB configuration was especially easy to set up
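
The SRB client ships a set of command-line utilities (the Scommands). A minimal sketch of driving them from Python follows; Sinit, Sput and Sls are standard Scommands, but the target collection path shown here is illustrative, and option details vary with the SRB version and site configuration:

    # Minimal sketch: drive the SRB Scommands from Python.
    # The collection path is hypothetical; adapt it to your SRB zone.
    import subprocess

    def srb(*args):
        """Run one Scommand and raise if it fails."""
        subprocess.run(list(args), check=True)

    srb("Sinit")                                    # open a session using the local ~/.srb config
    srb("Sput", "evts.root", "/home/demo/run123/")  # upload a local file into a collection
    srb("Sls", "/home/demo/run123/")                # list the remote collection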
