
Imperial College Site Report

This report outlines the equipment and software used at Imperial College for their High Energy Physics (HEP) program, including Unix servers and desktops, Linux PC farms, Windows servers and desktops, and the Solaris Group server. It also discusses their activities, such as MC production and grid developments, as well as current issues related to server and desktop maintenance, security, and software preferences.





Presentation Transcript


  1. Imperial College Site Report HEP Sysman meeting 28 April 2003 Imperial College

  2. Outline
  • Equipment
    • what we have
  • Software
    • what we use
  • Activities
    • what we do
  • Problems
    • all of the above

  3. Equipment
  • Unix servers & desktops
    • 2 x Sun E450 servers (Solaris 8, ~1 Tbyte)
    • one JREI-funded BaBar resource, one general
    • looking old and expensive now compared to…
  • Linux PC-based servers and desktops
    • 2 PC-based RAID servers, ~1 Tbyte each
    • 4 dual-processor, 2 GHz, rack-mounted machines, RH 7.3, for general interactive/batch use
    • various individual Linux desktops (some Grid)

  4. Equipment
  • Linux PC farms
    • BaBar Linux PC farm, from JREI
      • main analysis facility for the IC BaBar group: 2 dual-CPU masters, 40 dual-CPU workers, PBS batch system, ~300 Gbytes per master
    • CMS/Grid PC farm
      • 5 master nodes with 440 Gbytes of disk each; 40 worker nodes; all 1 GHz dual PIII with 1 GB of RAM per CPU
    • LeSC grid resources
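A PBS batch system like the one on the BaBar farm takes jobs as shell scripts with `#PBS` directives. A minimal sketch of such a submission script; the job name, resource limits and analysis command here are hypothetical illustrations, not the site's actual configuration:

```shell
# Write a minimal PBS job script (all names hypothetical), which would then
# be queued with: qsub analysis.pbs
cat > analysis.pbs <<'EOF'
#!/bin/sh
#PBS -N babar_analysis
#PBS -l nodes=1:ppn=2
#PBS -l walltime=04:00:00
#PBS -j oe
# PBS sets PBS_O_WORKDIR to the directory the job was submitted from
cd "$PBS_O_WORKDIR"
./run_analysis
EOF
echo "wrote analysis.pbs"
```

Requesting `nodes=1:ppn=2` matches the dual-CPU workers described above; `-j oe` merges stdout and stderr into one output file.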

  5. Equipment
  • Windows servers and desktops
    • Windows 2000 server & backup server
      • group W2K domain accounts
      • profiles, home directories, experiment areas
      • domain printer queues
    • W2K and XP desktop PCs (~70 machines)
      • current default desktop environment
      • MS Office, Windows ssh, Exceed, …
      • some PCs with specialist software, e.g. CAD

  6. Solaris Group Server
  • Sun Enterprise E450 running Solaris 2.8
  • three 400 MHz processors
  • two network interfaces
    • 100 Mbit/s to the original subnet
    • 1 Gbit/s to the farm subnet
  • ~1 Tbyte of disk, of which 800 Gbytes RAID
  • web, email, user accounts, …

  7. Software
  • Unix: Solaris 2.8, Red Hat Linux 7.3
    • no user software supported on Solaris
    • the usual Linux s/w plus whatever experiment-specific software we need
    • the Linux version is tied to the experiments
  • Windows server and desktops
    • a College deal provides standard MS Office products for licensed Windows PCs

  8. Activities
  • HEP programme
    • BaBar, CMS, DØ, LHCb, ZEUS, dark matter, neutrino factory, detector development
  • considerable MC production for the experimental programmes
  • Grid developments
    • see separate slides…
  • desktop Office applications

  9. LHCb MC production

  10. DØ MC production

  11. Grid Developments
  • We are a testbed node
    • with a CE, 8 WNs (dual 1 GHz PIII) and 1 SE with ~440 GB
  • We run a resource broker (RB)
    • used as one of the 4 production RBs (the others are at CERN, Lyon and CNAF)
    • it is also the GridPP and BaBar RB
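Jobs reach an EDG resource broker as JDL (Job Description Language) files submitted through the workload management tools. A minimal sketch, with hypothetical file names; the RB configuration itself is not shown:

```shell
# Write a minimal EDG JDL job description (names illustrative only);
# it would be submitted through the RB with: edg-job-submit hello.jdl
cat > hello.jdl <<'EOF'
Executable    = "/bin/echo";
Arguments     = "hello from the grid";
StdOutput     = "hello.out";
StdError      = "hello.err";
OutputSandbox = {"hello.out", "hello.err"};
EOF
echo "wrote hello.jdl"
```

The RB matches the job's requirements against available CEs; the output sandbox files listed here would be retrieved after the job completes.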

  12. Grid Developments
  • We took part in the CMS Grid stress test before Christmas.
  • We run a production-quality (?) SAM station which automatically delivers the data required by our DØ members.
  • We have gridified part of the BaBar farm (80 x 800 MHz PIII).

  13. e-Science (not Grid)
  • We are now making heavy use of Viking at LeSC
    • 132 dual 2 GHz Xeon nodes currently; a new procurement is underway and another is due in ~6 months
    • We also use the HSM hosted by Saturn (24-processor E6800, 6 TB of disk, 24 TB of tape space).
  • We have found issues with timeouts as data is transferred from tape.

  14. Current Issues: Server
  • The Suns are getting old and are not cost-effective.
  • The BaBar Sun is out of warranty; the group server will be next year.
  • Maintenance on the RAIDs for the Suns costs too much, and the disks are expensive.
    • take CPU maintenance only, and assign part of the RAID as spares

  15. Current Issues: Desktops
  • Do we stick with Windows for the standard desktop?
  • College policy is for a Windows desktop
    • they also want complete control over all aspects of installed s/w, PC purchase and networking: a "standardised desktop"
  • Increasingly, users want a Linux desktop
    • especially Grid developers
    • many have only infrequent need for Windows

  16. Current Issues: Desktops
  • Dual booting is unattractive
    • experimenting with Terminal Server software for "occasional" Windows users
    • seems to work well, but we need to clarify the licensing situation; probably OK for us
  • We have considered providing laptops
    • many people already use laptops as their default desktop machine; there are advantages when travelling, e.g. on LTA

  17. Current Issues: Security
  • College firewall
    • moving to a default "deny all" policy this year
    • maybe even ssh blocked unless registered
  • Recent blocks on all ports > 1026 are already causing problems
    • ftp call-backs etc. are a problem for Kerberised ftp to FNAL, which needs PASV mode; tough for emacs
    • some problems for Grid apps

  18. Current Issues: Security
  • Some Grid developments clash with the need for secure systems.
    • edg software still needs the obsolete RH 6.2
  • Most of our Grid developers are really ex-HEP RAs, not software professionals.
    • We need to make sure they are not cutting corners on security and compromising the rest of our systems for expediency.
    • Of course, they all want root access.

  19. Imperial College
