
Grid Development @ Glasgow


Presentation Transcript


  1. Outline
  • LHC Computing at a Glance
  • Glasgow Starting Point
  • LHC Computing Challenge
  • CPU Intensive Applications
  • Timeline
  • ScotGRID
  • EDG TestBed 1 Status
  • Middleware: Overview of SAM; Spitfire – security mechanism; Optor – replica optimiser simulation; Monitoring
  • Prototype: Hardware, Software
  • People
  • Summary

  2. Grid Team (organisation diagram): System, Middleware, Applications, Hardware, Software

  3. LHC Computing at a Glance
  • The investment in LHC computing will be massive: the LHC Computing Review estimated 240 MCHF, plus 80 MCHF/year afterwards
  • These facilities will be distributed, for political as well as sociological and practical reasons
  • Europe: 267 institutes, 4603 users; elsewhere: 208 institutes, 1632 users

  4. Starting Point

  5. Starting Point “Current technology would not be able to scale data to such an extent, which is where the teams at Glasgow and Edinburgh Universities come in. … It is hoped that the computing technology developed during the project will have wider applications in the future, with possible uses in astronomy, computing science and genomics observation, as well as providing generic technology and software for the next generation Internet.”

  6. LHC Computing Challenge (tier model)
  • One bunch crossing every 25 ns; 100 triggers per second; each event is ~1 MByte
  • Online System (~PBytes/sec from the detector) → Offline Farm (~20 TIPS) at ~100 MBytes/sec
  • Tier 0: CERN Computer Centre, >20 TIPS, HPSS mass storage; fed at ~100 MBytes/sec
  • Tier 1: regional centres (RAL, US, Italian, French), each with HPSS, connected at ~Gbits/sec or by air freight
  • Tier 2: Tier-2 centres of ~1 TIPS each, including ScotGRID++ (~1 TIPS)
  • Tier 3: institutes (~0.25 TIPS) with physics data caches, ~Gbits/sec links; physicists work on analysis "channels"; Glasgow has ~10 physicists working on one or more channels, and data for these channels is cached by the Glasgow server
  • Tier 4: workstations, 100–1000 Mbits/sec
  • Scale: 1 TIPS = 25,000 SpecInt95; a PC (1999) is ~15 SpecInt95
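A quick check of how the trigger figures above combine into the quoted data rate (the yearly volume is our own extrapolation, assuming ~10^7 seconds of running per year):

```latex
\[
  100\ \tfrac{\text{events}}{\text{s}} \times 1\ \tfrac{\text{MB}}{\text{event}}
  \approx 100\ \tfrac{\text{MB}}{\text{s}}
  \;\Rightarrow\;
  100\ \tfrac{\text{MB}}{\text{s}} \times 10^{7}\ \tfrac{\text{s}}{\text{year}}
  \approx 1\ \tfrac{\text{PB}}{\text{year}}.
\]
```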

  7. CPU Intensive Applications
  Numerically intensive simulations with minimal input and output data:
  • ATLAS Monte Carlo (gg → H → bb): 182 sec / 3.5 MB per event on a 1000 MHz Linux box
  Compiler tests:
  Compiler        Speed (MFlops)
  Fortran (g77)   27
  C (gcc)         43
  Java (jdk)      41
  Standalone physics applications:
  1. Simulation of neutron/photon/electron interactions for 3D detector design
  2. NLO QCD physics simulation
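The MFlops figures in the table come from a floating-point micro-benchmark. The sketch below is a minimal Java stand-in for that kind of test; the kernel, loop size, and class name are illustrative assumptions, not the benchmark actually used for the table.

```java
// Minimal MFlops-style micro-benchmark sketch; illustrative only, not the
// actual compiler test behind the table above.
public class FlopsBenchmark {
    public static void main(String[] args) {
        final int n = 100_000_000;
        double a = 1.1, b = 2.2, sum = 0.0;

        long start = System.currentTimeMillis();
        for (int i = 0; i < n; i++) {
            sum += a * b;   // one multiply + one add per iteration (2 counted flops)
            a += 1.0e-9;    // stop the compiler hoisting the multiply out of the loop
        }
        long elapsedMs = Math.max(1, System.currentTimeMillis() - start);

        // MFlops = flops / (seconds * 1e6) = 2n / (elapsedMs * 1e3)
        double mflops = (2.0 * n) / (elapsedMs * 1.0e3);
        System.out.printf("sum = %.3f, %d ms, ~%.1f MFlops%n", sum, elapsedMs, mflops);
    }
}
```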

  8. Timeline (2002–2005, by quarter)
  Milestones shown on the timeline:
  • Prototype of Hybrid Event Store (Persistency Framework)
  • Hybrid Event Store available for general users' applications
  • Distributed production using grid services
  • Full Persistency Framework
  • Distributed end-user interactive analysis
  • First Global Grid Service (LCG-1) available
  • LCG-1 reliability and performance targets
  • "50% prototype" (LCG-3) available
  • LHC Global Grid TDR
  • ScotGRID: ~250 CPUs + ~50 TBytes

  9. ScotGRID

  10. ScotGRID
  • ScotGRID processing nodes at Glasgow
    – 59 IBM xSeries 330, dual 1 GHz Pentium III, 2 GB memory
    – 2 IBM xSeries 340, dual 1 GHz Pentium III, 2 GB memory, dual ethernet
    – 3 IBM xSeries 340, dual 1 GHz Pentium III, 2 GB memory, 100 + 1000 Mbit/s ethernet
    – 1 TB disk
    – LTO/Ultrium tape library
    – Cisco ethernet switches
  • ScotGRID storage at Edinburgh
    – IBM xSeries 370, PIII Xeon, 32 x 512 MB RAM
    – 70 x 73.4 GB IBM FC hot-swap HDD
  • Griddev test rig at Glasgow
    – 4 x 233 MHz Pentium II
  • BaBar UltraGrid system at Edinburgh
    – 4 UltraSparc 80 machines in a rack, 450 MHz CPUs with 4 MB cache, 1 GB memory each
    – Fast Ethernet and MirrorNet switching
  • CDF equipment at Glasgow
    – 8 x 700 MHz Xeon IBM xSeries 370, 4 GB memory, 1 TB disk

  11. EDG TestBed 1 Status
  Web interface showing the status of ~400 servers at TestBed 1 sites; the Grid is being extended to all experiments.

  12. Glasgow within the Grid

  13. Overview of SAM

  14. Spitfire – Security Mechanism
  Request flow shown in the diagram:
  1. The client sends an HTTP + SSL request with its client certificate to the Servlet Container (SSLServletSocketFactory).
  2. The TrustManager checks the certificate against the Trusted CAs and the Revoked Certs repository: is the certificate signed by a trusted CA, and has it been revoked? If either check fails, the request is rejected.
  3. The Security Servlet hands the request to the Authorization Module: does the user specify a role? If so, the role is checked against the Role repository; if not, the default role is found.
  4. The Translator Servlet uses the role–connection mappings to map the role to a connection ID and requests a connection from the Connection Pool to the RDBMS.
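The same flow can be sketched in servlet-style Java. This is only a hedged illustration of the certificate → role → connection mapping: RoleRepository, RevocationList and ConnectionPool are hypothetical stand-ins for boxes in the diagram, and none of this is the actual EDG Spitfire code.

```java
import java.io.IOException;
import java.security.cert.X509Certificate;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch of the Spitfire-style security flow: client certificate -> role -> connection id.
public class SecurityServlet extends HttpServlet {

    // Hypothetical stand-ins for the repositories and pool in the diagram.
    interface RoleRepository {
        boolean allows(String subject, String role);
        String defaultRole(String subject);
    }
    interface RevocationList { boolean contains(X509Certificate cert); }
    interface ConnectionPool { String connectionFor(String role); }

    private RoleRepository roles;     // would be wired up in init() in a real deployment
    private RevocationList revoked;
    private ConnectionPool pool;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // The SSL layer (SSLServletSocketFactory / TrustManager in the diagram) has already
        // verified the chain against the trusted CAs; the container exposes it as an attribute.
        X509Certificate[] chain =
            (X509Certificate[]) req.getAttribute("javax.servlet.request.X509Certificate");

        if (chain == null || revoked.contains(chain[0])) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN, "certificate rejected");
            return;
        }

        // Authorization module: use the role the user asked for, otherwise the default role.
        String subject = chain[0].getSubjectX500Principal().getName();
        String requested = req.getParameter("role");
        String role = (requested != null && roles.allows(subject, requested))
                ? requested
                : roles.defaultRole(subject);

        // Translator servlet: map the role to a database connection id from the pool.
        resp.getWriter().println("granted connection " + pool.connectionFor(role));
    }
}
```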

  15. Optor – replica optimiser simulation
  • Simulate a prototype Grid
  • Input: site policies and experiment data files
  • Introduce a replication algorithm:
    – Files are always replicated to the local storage.
    – If necessary, the oldest files are deleted.
  • Even a basic replication algorithm significantly reduces network traffic and program running times (a minimal sketch of this policy follows below)
  • New economics-based algorithms are under investigation
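The sketch below illustrates the basic policy described above: every requested file is replicated to local storage, and the oldest local replicas are deleted when space runs out. The class, its capacity-in-files model, and all names are illustrative assumptions, not the actual Optor simulation code.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

// Basic replication policy from the slide: always replicate to local storage,
// deleting the oldest local files when necessary. Illustrative sketch only.
public class LocalReplicaStore {

    private final int capacity;                                    // max local replicas
    private final Deque<String> arrivalOrder = new ArrayDeque<>(); // oldest file first
    private final Set<String> local = new HashSet<>();

    public LocalReplicaStore(int capacity) {
        this.capacity = capacity;
    }

    /** Returns true if the file was already local (no wide-area transfer needed). */
    public boolean request(String file) {
        if (local.contains(file)) {
            return true;                        // local hit: no network traffic
        }
        while (local.size() >= capacity) {      // make room by deleting the oldest files
            local.remove(arrivalOrder.removeFirst());
        }
        local.add(file);                        // always replicate to local storage
        arrivalOrder.addLast(file);
        return false;                           // remote fetch: counts as network traffic
    }
}
```

Counting the false returns over a job's file requests gives the kind of network-traffic saving the slide refers to; the economics-based algorithms mentioned above would replace the simple "delete the oldest" rule with a per-replica valuation.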

  16. Prototypes – real world… simulated world…
  • Tools: Java Analysis Studio over TCP/IP
  • Instantaneous CPU usage
  • Scalable architecture
  • Individual node information
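The slide itself only shows screenshots, but the idea (per-node information streamed over TCP/IP into Java Analysis Studio) can be sketched as a small node-side reporter. The host, port, and line format below are assumptions for illustration, not the prototype's actual protocol or code.

```java
import java.io.PrintWriter;
import java.lang.management.ManagementFactory;
import java.net.InetAddress;
import java.net.Socket;

// Node-side monitoring reporter sketch: pushes periodic CPU-load samples over TCP
// to a collector. Purely illustrative; not the Glasgow prototype's implementation.
public class NodeMonitor {
    public static void main(String[] args) throws Exception {
        String collectorHost = args.length > 0 ? args[0] : "localhost"; // assumed collector
        int collectorPort = args.length > 1 ? Integer.parseInt(args[1]) : 9876;

        try (Socket socket = new Socket(collectorHost, collectorPort);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            String node = InetAddress.getLocalHost().getHostName();
            while (true) {
                // 1-minute system load average; -1.0 if the platform does not report it.
                double load = ManagementFactory.getOperatingSystemMXBean().getSystemLoadAverage();
                out.printf("%s %d %.2f%n", node, System.currentTimeMillis(), load);
                Thread.sleep(5000);             // one sample every five seconds
            }
        }
    }
}
```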

  17. NeSC Opening – ScotGRID Hardware and Software

  18. Glasgow Investment in Computing Infrastructure
  • Long tradition
  • Significant departmental investment
  • £100,000 refurbishment (just completed)
  • Long-term commitment (LHC era ~ 15 years)
  • Strong System Management Team – underpinning role
  • New Grid Data Management Group – fundamental to Grid development
  • ATLAS/CDF/LHCb software
  • Alliances with Glasgow Computing Science, Edinburgh, IBM

  19. Summary (to be updated…)
  • Grids are (already) becoming a reality
  • Mutual interest: the ScotGRID example
  • Glasgow emphasis on DataGrid core development: Grid Data Management (CERN + UK lead)
  • Multidisciplinary approach on a university + regional basis
  • Applications: ATLAS, CDF, LHCb
  • Large distributed databases are a common problem (= challenge): CDF, LHC, genes, proteins
  (Slide images: the detector for the LHCb experiment and the detector for the ALICE experiment)
