
CC-IN2P3: Overview of Astroparticle Computing Needs in France

This overview describes CC-IN2P3's role in federating the computing needs of the French community in high energy physics, nuclear physics, and astroparticle physics. It covers computing resources, storage systems, and the projects served.


Presentation Transcript


  1. Astroparticle @ CC-IN2P3: an overview (Jean-Yves Nief)

  2. What is CC-IN2P3? • Federates the computing needs of the French community for: • High energy physics. • Nuclear physics. • Astroparticle physics.

  3. Computing @ CC-IN2P3 • > 5000 Unix accounts. • ~ 20 astroparticle groups. • Interactive machines (Linux): • Prepare tasks for the batch system (compile) or interactive work (visualization, analysis, …). • Batch farms using BQS (Linux): • 1 farm for independent tasks: • 5000 CPUs. • 11 million SI2k (25.7 teraflops). • 1 farm for parallel computing (MPI, PVM): • 294 CPUs. • Shared (mutualized) access to the farms. • Batch scheduling takes group priorities into account.
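The slides name the BQS batch system but do not show how a job reaches the farm, so the sketch below is purely illustrative: it assumes a qsub-style submission command and a hypothetical job script and data path, none of which come from the presentation itself.

    # Illustrative only: wraps an assumed qsub-style submission command;
    # the job script contents and the data path are hypothetical examples.
    import subprocess
    import textwrap

    job_script = textwrap.dedent("""\
        #!/bin/sh
        # Hypothetical analysis job: preparation (compilation, setup) is done
        # on the interactive machines, only the heavy processing runs here.
        ./run_analysis --input /sps/mygroup/data/run_001.dat
    """)

    with open("analysis_job.sh", "w") as f:
        f.write(job_script)

    # Submit to the batch system; per-group priority is handled by the
    # scheduler itself, so no priority option is passed here.
    result = subprocess.run(["qsub", "analysis_job.sh"],
                            capture_output=True, text=True, check=False)
    print(result.stdout or result.stderr)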

  4. Storage @ CC-IN2P3 • Storage media: • Disks, tapes. • Data access: • AFS. • NFS, GPFS. • HPSS (access to the Mass Storage System). • dCache, Xrootd, SRB. • ~ 550 TB so far for astroparticle (in HPSS).
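To make the data-access layer more concrete, here is a minimal sketch of reading a file over the Xrootd protocol using the XRootD Python bindings; the redirector host and file path are invented placeholders, and the Python client simply stands in for whatever native client a given experiment actually used.

    # Minimal Xrootd read sketch (illustrative): the host and path below are
    # hypothetical, not actual CC-IN2P3 endpoints.
    from XRootD import client
    from XRootD.client.flags import OpenFlags

    f = client.File()
    status, _ = f.open("root://xrootd.example.in2p3.fr//demo/data/file.root",
                       OpenFlags.READ)
    if not status.ok:
        raise RuntimeError(status.message)

    # Read the first megabyte; a real analysis would hand the file to ROOT
    # or another framework instead of reading raw bytes.
    status, data = f.read(offset=0, size=1024 * 1024)
    print(f"read {len(data)} bytes")
    f.close()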

  5. Astroparticle @ CC-IN2P3 • Serving all the projects in which IN2P3 labs participate (the list below is not complete!). • Cosmic ray studies: • Pierre Auger Observatory, Antares, AMS. • Gamma ray astronomy: • HESS, GLAST. • CMB measurement: • Planck. • Gravitational wave observatory: • Virgo. • Supernova studies: • SNLS, Supernova Factory. • Dark matter search: • EROS. • Edelweiss. • Future involvements? • LSST. • LISA.

  6. Astroparticle @ CC-IN2P3 • CPU usage in 2007. • CPU usage since Jan. 1st. (Usage charts not reproduced in this transcript.)

  7. Cosmic rays Pierre Auger Observatory • Data transferred from Argentina to Lyon. • Lyon: primary site for storage and distribution of real data and simulations. • Batch farm used for simulation and data analysis by foreign (non-IN2P3) collaborators. • NFS, SRB, HPSS.
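The SRB-based distribution is not spelled out in the slides; assuming the standard SRB S-commands client (Sinit, Sput, Sls, Sexit) is installed and configured, pushing a data file into a collection could look roughly like the wrapper below, with the file name and collection path invented for the example.

    # Rough sketch of uploading one file into SRB via the S-commands client;
    # the file name and collection path are hypothetical, not the
    # collaboration's actual scripts.
    import subprocess

    def srb_put(local_path: str, collection: str) -> None:
        """Open an SRB session, upload one file, list the collection, close."""
        subprocess.run(["Sinit"], check=True)                         # open session
        subprocess.run(["Sput", local_path, collection], check=True)  # upload file
        subprocess.run(["Sls", collection], check=True)               # verify listing
        subprocess.run(["Sexit"], check=True)                         # close session

    if __name__ == "__main__":
        srb_put("auger_run_2007_11.root", "/homeZone/home/auger.lyon/raw")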

  8. Cosmic rays Antares • Underwater neutrino detector close to Toulon (at a depth of 2500 m). • Real-time database (calibrations, …) hosted in Lyon. • Lyon: primary site for storage and distribution of real data and simulations (~ 300 TB/y). • CC-IN2P3 used for data analysis and simulation production. • Xrootd, SRB.

  9. Gamma ray astronomy HESS • Local batch farm for data acquisition. • 2 copies of the data on tapes: Lyon and Heidelberg. • Weekly transfer to France (by carrier). • Biggest astroparticle CPU consumer at CC-IN2P3 (3rd overall in 2006 and 7th in 2007): used for calibration, data reconstruction, etc. • Xrootd, NFS, HPSS.

  10. Gamma ray astronomy GLAST • Large computing needs for simulation production. • Main computing site: SLAC, Stanford. • CC-IN2P3 doing simulation production: jobs submitted remotely from SLAC to production sites such as Lyon (Java + RMI). • NFS, Xrootd (for data transfer). • Current simulation campaign: • ~ 10 TB to be produced. • 30,000 jobs.

  11. CMB measurement Planck • CC-IN2P3 used for Planck HFI: • Simulation (main task). • Data analysis. • Read and write access much more challenging than for HEP: • e.g., aggregated read+write rate for 60 tasks: 500 MB/s. • NFS totally inadequate, using GPFS instead. • Some usage of MPI. • Requires more memory than other groups (up to 16 GB). • Hundreds of TB of simulation to be produced in Lyon?
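As a simplified picture of the GPFS + MPI usage pattern, the sketch below has every MPI task write its own chunk to a shared directory, which is the kind of concurrent access behind the quoted ~500 MB/s aggregate rate; mpi4py, the output path, and the data sizes are illustrative choices, not Planck HFI details.

    # Minimal sketch of many tasks writing concurrently to a shared file
    # system (e.g. a GPFS mount). Path and sizes are arbitrary examples.
    from mpi4py import MPI
    import numpy as np
    import os

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    outdir = "/gpfs/example/planck_sim"   # hypothetical shared directory
    os.makedirs(outdir, exist_ok=True)

    # Each task produces its own chunk of simulated data and writes it
    # independently; the file system has to absorb the aggregated rate.
    chunk = np.random.default_rng(rank).standard_normal(1_000_000)
    np.save(os.path.join(outdir, f"chunk_{rank:03d}.npy"), chunk)

    comm.Barrier()
    if rank == 0:
        print(f"{comm.Get_size()} tasks wrote their chunks to {outdir}")

It could be launched with something like "mpirun -n 60 python planck_io_sketch.py", 60 being the task count quoted in the slide.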

  12. Gravitational waves Virgo • Two main computing centres: Lyon + Bologna. • Data analysis at these two computing sites (+ ~ 60 TB/y). • Some usage of EGEE for simulations. • Analysis activities with data access are not on the grid: plans with EGEE? • Data distribution: now switching to SRB; will also be used for LIGO transfers to Lyon, remote data visualization? • Xrootd, NFS, SRB, HPSS.

  13. Supernova studies SNLS and SNFactory • Telescope in Hawaii. • SNLS: Canada / France. • SNFactory: USA / France. • Small amount of data produced, but large disk volumes needed for derived products (> 40 TB). • Lots of I/O: e.g., 640 MB/s for 120 jobs (roughly 5 MB/s per job). • SNFactory: usage of CC-IN2P3 for real-time analysis. • GPFS, SRB.

  14. Future involvements? LSST • Large Synoptic Survey Telescope: starting in 2014. • A real challenge for data access. • Derived data could reach the exabyte (EB) scale (e.g., supernova studies)! • CC-IN2P3 could be involved in the computing part of the project. • Participation in DC3 (Fall 2007): transfers and data replication (NCSA, SDSC, CC-IN2P3) using UDT and iRODS.
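The iRODS side of such a data-replication exercise can be illustrated with the modern python-irodsclient, shown below purely as a sketch (the data challenge itself relied on the iRODS and UDT tooling of the time); host, zone, credentials, and paths are all placeholders.

    # Illustrative iRODS upload with python-irodsclient; connection details
    # and paths are placeholders, not DC3 configuration.
    from irods.session import iRODSSession

    with iRODSSession(host="irods.example.in2p3.fr", port=1247,
                      user="lsst", password="changeme",
                      zone="lsstZone") as session:
        logical_path = "/lsstZone/home/lsst/dc3/image_000001.fits"

        # Register the local file into the data grid catalogue.
        session.data_objects.put("image_000001.fits", logical_path)

        # Check the catalogue entry and read a few bytes back as a sanity test.
        obj = session.data_objects.get(logical_path)
        print(obj.name, obj.size)
        with obj.open("r") as f:
            header = f.read(2880)   # one FITS header block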

  15. Prospects and challenges • Main challenges for present and future: • I/O rate for data analysis and processing. • Multi-site data management and preservation → data access and storage. • Middleware usage: • EGEE: little usage so far. • SRB being used in production (hundreds of TB / y). • iRODS: interface with VOSpace planned (Observatoire de Strasbourg).

  16. Prospects and challenges
