
LHCb(UK) Computing/Grid: RAL Perspective Glenn Patrick 08.06.00






Presentation Transcript


  1. LHCb(UK) Computing/Grid: RAL Perspective
     Glenn Patrick 08.06.00
     Central UK Computing (what is hoped for)
     JIF bid - Prototype UK national computing centre (Tier 1) for all 4 LHC experiments - outcome known in ~November.
     Integrated Resources        2001   2002   2003
     Processors (PC99-450MHz)     830   1670   3100
     Disk (TB)                     25     50    125
     Tape (TB)                     67    130    330
     Glenn Patrick (g.n.patrick@rl.ac.uk)

  2. What exists now? RAL-CSF: Main LHCb Platform
     • Currently, 160*P450 equivalent processors.
     • Hope to expand to ~300*P450 in September.
     • Linux Red Hat 6.1 being phased in on all machines (HPs being shut down) to give compatibility with CERN (eg. lxplus).
     • PBS (Portable Batch System), not NQS.
     • 1TB+ of robotic tape space for LHCb.
     • 500GB+ of disk space for LHCb (need to request).
     • Globus toolkit v1.1.1 installed on front-end (with testbed service on another machine).
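The slide above names PBS as the batch system on RAL-CSF. Below is a minimal Python sketch of driving it; the job name, walltime limit and script body are illustrative assumptions, not actual RAL-CSF settings, and only the standard qsub command and #PBS directives are relied on.

```python
# Minimal sketch: submitting a job to PBS (the batch system named above) from
# Python via qsub. The job name, walltime and script body are illustrative
# assumptions, not actual RAL-CSF settings.
import subprocess
import tempfile

JOB_SCRIPT = """#!/bin/sh
#PBS -N lhcb_test
#PBS -l walltime=01:00:00
echo "Running on $(hostname)"
"""

def submit(script_text):
    """Write the job script to a temporary file and hand it to qsub.

    Returns the job identifier that qsub prints (e.g. '123.server').
    """
    with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
        f.write(script_text)
        path = f.name
    result = subprocess.run(["qsub", path], capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print("Submitted:", submit(JOB_SCRIPT))
```

The same job could equally be submitted by hand with qsub from the front-end; the wrapper is only there to show the moving parts.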

  3. [Diagram: RAL Particle Physics Unix Services] HP and Linux batch farms, HP/Linux/Sun interactive services, NIS, userids, AFS, /home and scratch disk farm (n TB), and the DataStore, connected by a 100 Megabit switched network with FDDI.

  4. LHCb Software
     • LHCb software stored in 4GB AFS project space /afs/rl.ac.uk/lhcb
     • Updated just after midnight every night.
     • CMT/CVS installed (although no remote updating to CERN repository).
     • Crude LHCb environment at the moment, but managed to process events through SICBMC with little knowledge of LHCb software.
     • Available for LHCb to exploit for detector, physics & Grid(?) studies.
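As an illustration of what "process events through SICBMC" involves in this crude environment, here is a rough Python sketch of a wrapper that points at the AFS software area and launches the simulation. The executable location, data-card file and LHCBHOME variable are hypothetical placeholders; only the /afs/rl.ac.uk/lhcb path and the SICBMC name come from the slide.

```python
# Rough sketch only: check the nightly-updated AFS software area and launch a
# SICBMC run. The executable path, data-card file and LHCBHOME variable are
# hypothetical placeholders; only /afs/rl.ac.uk/lhcb is taken from the slide.
import os
import subprocess

AFS_AREA = "/afs/rl.ac.uk/lhcb"                       # LHCb AFS project space (from the slide)
SICBMC_EXE = os.path.join(AFS_AREA, "bin", "sicbmc")  # hypothetical location
CARDS = "sicbmc.cards"                                # hypothetical data-card file

def run_sicbmc(n_events):
    """Launch one SICBMC job against the AFS-resident software area."""
    if not os.path.isdir(AFS_AREA):
        raise RuntimeError("AFS software area not visible on this node")
    env = dict(os.environ, LHCBHOME=AFS_AREA)         # hypothetical environment variable
    # Exact invocation would follow whatever the LHCb production scripts of
    # the time did; this just hands over the card file and event count.
    return subprocess.call([SICBMC_EXE, CARDS, str(n_events)], env=env)

if __name__ == "__main__":
    run_sicbmc(500)
```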

  5. MC Production: RAL NT Farm
     • 18*450MHz PII + 9*200MHz Pentium Pro.
     • LHCb front-end in addition to dual-cpu front-end.
     • Production capacity 100k-200k events/week.
     • 500k bb events processed so far and stored in RAL DataStore.
     • Events now transferred over network to CERN using RAL VTP protocol instead of DLTs.
     • Thanks to Dave Salmon, Eric van H & Chris Brew.
     • Latest production code being installed (DS).

  6. [Diagram: RAL NT Farm - new front-end & extra batch nodes] Front end, PDC and BDCs, batch nodes, file server (4 + 4 GB), peripherals and new systems on a 100Mb/s switch, with 18GB DAT and LAN & WAN connections.

  7. Grid Developments
     There is now a “CLRC Team” for the particle physics grid + several work groups (GNP represents LHCb with CAJB also a member).
     • Important that this is beneficial for LHCb.
     • EU (DataGrid) application to distribute 10^7 events & 3TB using MAP/RAL/... does not start production until 2002.
     • Need to start now and acquire some practical experience and expertise → decide way forward.
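A quick back-of-envelope check of that DataGrid target, sketched in Python below: only the 10^7 events and 3 TB come from this slide, and the 100k-200k events/week rate is the NT-farm figure from slide 5.

```python
# Back-of-envelope check of the DataGrid target quoted above.
# 10**7 events and 3 TB are from this slide; the 100k-200k events/week
# production rate is the NT-farm figure from slide 5.
events = 10**7
volume_tb = 3.0

avg_event_kb = volume_tb * 1e9 / events   # 1 TB taken as 1e9 kB
weeks_fast = events / 200_000             # at 200k events/week
weeks_slow = events / 100_000             # at 100k events/week

print(f"Average event size ~ {avg_event_kb:.0f} kB")                       # ~300 kB/event
print(f"Production time ~ {weeks_fast:.0f}-{weeks_slow:.0f} weeks at NT-farm rates")
```

At roughly 50-100 weeks of current farm capacity, the scale of the exercise is one reason to start acquiring Grid experience now rather than waiting for 2002.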

  8. Grid Developments II
     Meetings:
     14th June (RAL)  Small technical group to discuss short-term LHCb aims, testbeds, etc. (CERN, RAL, Liverpool, Glasgow...)
     21st June (RAL)  Globus Toolkit User Tutorial
     22nd June (RAL)  Globus Toolkit Developer Tutorial
     Open to all, register at http://www.globus.org/news/uk-registration.html
     23rd June (RAL)  Globus “strategy” meeting (invitation/nomination).

  9. Which Grid Topology for LHCb(UK)? Flexibility important.
     [Diagram] Tier 0: CERN; Tier 1: RAL, INFN, IN2P3, etc.; Tier 2: Liverpool, Glasgow, Edinburgh, etc.; below that, Department clusters and desktop users.
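Purely as an illustration, the tier hierarchy sketched on this slide can be written down as a small data structure; the layout is an assumption, the site names are the ones drawn on the slide.

```python
# Illustrative encoding of the tier hierarchy drawn on the slide; the dict
# layout is an assumption, the site names are those shown on the slide.
GRID_TIERS = {
    "Tier 0": ["CERN"],
    "Tier 1": ["RAL", "INFN", "IN2P3"],               # plus other national centres
    "Tier 2": ["Liverpool", "Glasgow", "Edinburgh"],  # plus other regional centres
    "Lower":  ["Department clusters", "Desktop users"],
}

if __name__ == "__main__":
    for tier, sites in GRID_TIERS.items():
        print(f"{tier:7}: {', '.join(sites)}")
```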

  10. Grid Issues
     • Starting to be asked for estimates of LHCb resources (central storage, etc) and Grid requirements for applications and testbeds.
     • Useful to have a LHCb(UK) forum for discussion & feedback → define model for all UK institutes, not just RAL.
     • Any documentation (including this talk) on computing/software/Grid at http://hepwww.rl.ac.uk/lhcb/computing/comphome.html
