
Cluster of Research Infrastructures for Synergies in Physics (CRISP)


Presentation Transcript


  1. Cluster of Research Infrastructures for Synergies in Physics (CRISP) Build up collaborations and create long-term synergies between research infrastructures on the ESFRI roadmap in the field of physics, astronomy and analytical facilities, in order to facilitate their implementation and enhance their efficiency and attractiveness.

  2. Facts And Figures • 11 research infrastructures • 16 participating institutions • Project start: 1 October 2011 • Project duration: 3 years • 12 M€ funding from EC within FP7

  3. Infrastructures • European Synchrotron Radiation Facility (ESRF) • Facility for Antiproton and Ion Research (FAIR) • Institut Laue–Langevin (ILL) • Super Large Hadron Collider (SLHC) • SPIRAL2 • European Spallation Source (ESS) • European X-ray Free Electron Laser (XFEL) • Square Kilometre Array (SKA) • European Free Electron Lasers (EuroFEL) • Extreme Light Infrastructure (ELI) • International Linear Collider (ILC)

  4. Partners • European Synchrotron Radiation Facility (ESRF) • Deutsches Elektronen-Synchrotron (DESY) • European Organisation for Nuclear Research (CERN) • European Spallation Source (ESS) • Grand Accélérateur National d'Ions Lourds (GANIL) • GSI Helmholtz Centre for Heavy Ion Research (GSI) • Institut Laue–Langevin (ILL) • European X-ray Free Electron Laser (XFEL) • The University of Rome La Sapienza • The Foundation for Research and Technology Hellas • The Instituto Superior Técnico (IST) • Istituto Nazionale di Fisica Nucleare (INFN) • MTA SZTAKI • The Horia Hulubei National Institute of Physics and Nuclear Engineering (IFIN-HH) • The University of Oxford • The University of Cambridge • The Paul Scherrer Institut (PSI)

  5. Relationships

  6. Accelerators • Instruments & Experiments • IT & Data Management • Detectors & Data Acquisition • Development and implementation of common solutions • Harmonisation, cost-efficiency and interoperability

  7. SKA Participation in Areas of Activity

  8. Dissemination and Industry-related Activities • Dissemination of the CRISP concepts and the results of the technical work to the CRISP participants, consortium members, the user communities, present and future industry partners, and science policy makers. • Organisation of topical workshops on industry-related activities as a forum for the exchange of experience. • T1 - Overall coordination and management of dissemination activities. • T2 - Establishment of a public web site, to be updated on a regular basis. • T3 - Trigger articles on the CRISP project in specialised media widely read by the scientific community and policy makers. • T4 - Participation in events organised by the (national) funding bodies of the CRISP consortium, with a focus on connections to industry (examples are Holland@CERN, Germany@ESO, STFC@ESRF/ILL, …). • T5 - Organisation of topical workshops for CRISP partners to exchange experience in relation to industry (procurement issues, technology transfer, ….). This will include establishing links to the activities going on within other INFRA-2011 projects. • All participants have 0.5 PM to support their own internal activities within this WP.

  9. SKA Participation in Areas of Activity

  10. Theme 3, Detectors and DAQ • High-throughput Detector Data Streaming • CO₂ Cooling • Innovative solutions for neutron and γ-ray detectors • Large-area thermal neutron detectors using ¹⁰B films

  11. High-throughput detector data streaming • To select, define and implement techniques and methods to reduce, transmit and process the high-throughput data streams produced by advanced detectors, simplifying detector design, construction and deployment, as well as reducing costs: • by selecting and defining reusable hardware and software interfaces and components • by investigating the use of FPGAs, GPUs and multi-core processors in flexible, user-friendly software/firmware stream processing • by identifying and assessing emerging hardware standards for future DAQ applications • SKA participation through Oxford (Kris Adami), 36 PM • T4 - Design and evaluate data rejection and compression schemes for images using FPGA and GPU processing hardware, on or immediately after the detector-head electronics or in the next downstream computing layer (a schematic sketch of such a rejection/compression step follows below). • T5 - Extend the stream-processing frameworks implemented on SKA pathfinders to accept data inputs from XFEL detectors and evaluate 'plug-and-play' capabilities for pipeline processing. Build a prototype framework.
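
The data-rejection and compression scheme mentioned in T4 can be illustrated with a short, purely schematic Python sketch: frames whose pixels never rise above a noise threshold are dropped, and the remainder are losslessly compressed. This is not the CRISP deliverable itself; the threshold value, frame size and use of NumPy/zlib are illustrative assumptions, whereas the project targets FPGA/GPU hardware on or near the detector-head electronics.

```python
# Illustrative sketch only: threshold-based frame rejection followed by lossless
# compression, the kind of data-reduction step described in T4. THRESHOLD and
# the frame shape are hypothetical values; a production system would run this
# on FPGA/GPU hardware close to the detector head, not in NumPy.
import zlib

import numpy as np

THRESHOLD = 10  # assumed ADC noise floor (hypothetical)


def reject_and_compress(frame: np.ndarray):
    """Drop frames with no pixel above threshold; losslessly compress the rest."""
    if frame.max() < THRESHOLD:
        return None  # rejected: nothing above the noise floor
    return zlib.compress(frame.astype(np.uint16).tobytes())


# Example stream: nine empty 512x512 frames and one frame with real signal.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 8, (512, 512)) for _ in range(9)]
frames.append(rng.integers(0, 4096, (512, 512)))

kept = [c for f in frames if (c := reject_and_compress(f)) is not None]
print(f"kept {len(kept)} of {len(frames)} frames")
```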

  12. SKA Participation in Areas of Activity

  13. Theme 4, IT and Data Management • User Identity • Metadata and Catalogues • Distributed Data Infrastructure • High-speed Data Recording

  14. High-speed Data Recording • High-speed recording of data, at rates that exceed tens of GB/s and in some cases from multiple sources, to permanent storage and archive • A cost-effective method with optimised and secured access to data using standard protocols • SKA participation through Cambridge (Bojan Nikolic), 36 PM • Investigation of technologies for local recording at the experiment and at the ROC • High-performance distributed filesystems (a simple multi-writer sketch follows below)
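
As an illustration of recording from several sources in parallel to shared permanent storage, here is a minimal Python sketch. It assumes the storage is reachable as an ordinary POSIX path (as a parallel-filesystem mount typically would be) and writes dummy data; the paths, chunk size and number of sources are hypothetical, and a real system sustaining tens of GB/s would rely on filesystem striping and high-speed networking rather than this simple pattern.

```python
# Minimal sketch, assuming permanent storage is exposed as a POSIX path
# (e.g. a parallel-filesystem mount). DATA_ROOT, CHUNK_BYTES and the number
# of sources are hypothetical; os.urandom() stands in for detector data.
import os
import tempfile
from concurrent.futures import ProcessPoolExecutor

CHUNK_BYTES = 16 * 1024 * 1024  # 16 MiB per write (assumed)


def record_stream(data_root: str, source_id: int, n_chunks: int) -> str:
    """Write n_chunks of dummy data for one source into its own file."""
    path = os.path.join(data_root, f"source{source_id:03d}.dat")
    with open(path, "wb", buffering=0) as f:
        for _ in range(n_chunks):
            f.write(os.urandom(CHUNK_BYTES))  # stand-in for detector data
    return path


if __name__ == "__main__":
    # Stand-in for a parallel-filesystem mount point; override via $DATA_ROOT.
    data_root = os.environ.get("DATA_ROOT", tempfile.mkdtemp())
    # Four independent sources writing concurrently, as in the multi-source case.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for path in pool.map(record_stream, [data_root] * 4, range(4), [4] * 4):
            print("recorded", path)
```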

  15. Distributed Data Infrastructure • Support the expanding data-management needs of the participating RIs • Analyse the existing distributed data infrastructures from the network and technology perspective • Reuse existing infrastructure where possible, depending on the requirements identified • Plan and experiment with their evolution, including the potential use of external providers • Understand the related policy issues • SKA participation through Oxford (David Wallom), 36 PM • Investigation of methodologies for data distribution and access at participating institutes and national centres • Possibly building on the optimised LHC technologies (tier/P2P model); a sketch of a tiered replica catalogue follows below
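
To make the tier/P2P idea concrete, the following sketch models a simple tiered replica catalogue: a dataset is registered at its origin site, replicated to another centre, and then located from a list of preferred sites. The class names, dataset name and site names are hypothetical illustrations, not CRISP or LHC software; production systems rely on grid data-management middleware rather than an in-memory dictionary.

```python
# Illustrative sketch of a tiered replica catalogue in the spirit of the
# LHC tier/P2P model referenced above. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    name: str
    size_tb: float
    replicas: set = field(default_factory=set)  # sites holding a copy


class ReplicaCatalogue:
    """Track which sites hold which datasets and choose a source for access."""

    def __init__(self):
        self.datasets = {}

    def register(self, name, size_tb, origin_site):
        self.datasets[name] = Dataset(name, size_tb, {origin_site})

    def replicate(self, name, site):
        self.datasets[name].replicas.add(site)

    def locate(self, name, preferred_sites):
        """Return the first preferred site holding a replica, else any holder."""
        ds = self.datasets[name]
        for site in preferred_sites:
            if site in ds.replicas:
                return site
        return next(iter(ds.replicas))


# Example: data produced at the observatory, replicated to a national centre.
cat = ReplicaCatalogue()
cat.register("pathfinder-visibilities", 120.0, origin_site="OBSERVATORY")
cat.replicate("pathfinder-visibilities", "NATIONAL-CENTRE")
print(cat.locate("pathfinder-visibilities", ["NATIONAL-CENTRE", "OBSERVATORY"]))
```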

  16. Summary • Three-year project starting 1 Oct 2011 • Four Areas • Accelerators • Instruments and Experiments • Detectors and Data Acquisition • IT and Data Management • SKA has significant interest: • One funded Sub-Area within Detectors and DAQ • High-throughput detector data streaming (36 PM) • Two funded Sub-Areas within IT and DM • High-speed Data Recording (36 PM) • Distributed Data Infrastructure (36 PM)
