
Birmingham site report



  1. Birmingham site report
Lawrie Lowe: System Manager
Yves Coppens: SouthGrid support
HEP System Managers' Meeting, RAL, May 2007

  2. Desktops
• 60 user-desktop PCs running Linux
• Older user desktops are 3.2 GHz Pentium machines with 1 GB RAM, running Scientific Linux 3
• Newer desktops are dual-core Athlon X2 machines with 2 GB RAM, running Scientific Linux 4
• Migrating all to SL4 at users' request
• 4 user desktops running Windows XP
• 12 PCs in labs running whatever the experiment requires: Windows or Linux

  3. Farm hardware
• Alice farm: 18 dual 800 MHz PC boxes
• BaBar farm: 120 dual 800 MHz blades
• Atlas farm: 38 dual 2.0 GHz blades
• 50% of the Atlas farm and 4% of the BaBar farm are for our local use, on SL3 (see the node-count sketch below)
• 50% of the Atlas farm and 90% of the BaBar farm are on the Grid
• 6% of the BaBar farm runs a Pre-Production Service (PPS)
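For concreteness, those percentages map onto whole machines roughly as follows. A minimal sketch, assuming each share simply rounds to the nearest blade; the slide does not spell out the exact partitioning:

```python
# Approximate machine counts implied by the stated farm shares.
# Assumption: each share rounds to the nearest whole blade.
farms = {"BaBar": 120, "Atlas": 38}

shares = [
    ("Atlas local (SL3)", "Atlas", 0.50),
    ("Atlas Grid",        "Atlas", 0.50),
    ("BaBar local (SL3)", "BaBar", 0.04),
    ("BaBar Grid",        "BaBar", 0.90),
    ("BaBar PPS",         "BaBar", 0.06),
]

for label, farm, frac in shares:
    nodes = round(farms[farm] * frac)
    print(f"{label}: {frac:.0%} of {farms[farm]} blades = ~{nodes}")
```

The BaBar fractions (4% + 90% + 6%) account for the whole farm, and the Atlas farm splits evenly between local use and the Grid.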

  4. Laptops
• 14 laptops in a group pool, dual-booting Windows XP and SL3/SL4
• ~10 user laptops (mainly students)
• All laptops sit behind an extra level of 'firewall' between them and the outside world

  5. Servers
• Linux servers running various systems: SL3, SL4, SL4_64, and CentOS 5 / SL5
• Most recent file server runs CentOS 5, with 16 TB of RAID storage split into two filesystems (sketched below)
• Citrix Windows Terminal Server(s) for the MS applications that are still required
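As an illustration of the two-filesystem split on the new file server, here is a minimal sketch using parted and mkfs.ext3 driven from Python. The device name /dev/sdb, the ext3 choice and the even 50/50 split are assumptions; the slide only says the 16 TB array is split as two filesystems.

```python
# Hypothetical sketch: split one large RAID block device into two
# partitions and create a filesystem on each.  /dev/sdb, ext3 and
# the even split are assumptions, not details from the slide.
import subprocess

DEV = "/dev/sdb"  # hypothetical RAID block device

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run("parted", "-s", DEV, "mklabel", "gpt")  # GPT label: needed above 2 TB
run("parted", "-s", DEV, "mkpart", "primary", "0%", "50%")
run("parted", "-s", DEV, "mkpart", "primary", "50%", "100%")
run("mkfs.ext3", DEV + "1")   # first ~8 TB filesystem
run("mkfs.ext3", DEV + "2")   # second ~8 TB filesystem
```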

  6. Networking
• Gigabit to the newer servers, 100 Mb/s to the desktop (but gigabit interfaces on most PCs)
• 2 Gbit/s from the department to the rest of campus
• The campus firewall is adopting a default-deny policy (illustrated below), but Physics has its own firewall
• Connection on campus to UKLight, currently in testing by campus staff
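As a generic illustration of what a default-deny policy looks like (the actual campus and Physics rule sets are not given in the slides), a minimal iptables sketch driven from Python:

```python
# Generic default-deny firewall sketch, illustration only; the real
# campus/Physics rules are not described in the slides.
import subprocess

def ipt(*args):
    subprocess.run(["iptables", *args], check=True)

ipt("-P", "INPUT", "DROP")       # deny inbound by default
ipt("-P", "FORWARD", "DROP")
ipt("-P", "OUTPUT", "ACCEPT")    # allow outbound

# Allow loopback and return traffic for established connections.
ipt("-A", "INPUT", "-i", "lo", "-j", "ACCEPT")
ipt("-A", "INPUT", "-m", "state", "--state", "ESTABLISHED,RELATED",
    "-j", "ACCEPT")

# Everything else must be opened explicitly, e.g. SSH (example port).
ipt("-A", "INPUT", "-p", "tcp", "--dport", "22", "-j", "ACCEPT")
```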

  7. Grid
• UK front-ends hosting CE, MON and SE, plus CEs for the BaBar farm and the PPS service
• Storage increased to 10 TB earlier this year
• All nodes run SL 3.0.5 with gLite 3 update 24
• PXE-boot/kickstart installation using yum and yaim, plus some steps to finish off the configuration (sketched below)
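The finishing steps after PXE boot and kickstart might be wrapped up roughly as below. This is a sketch only: the yaim invocation shown is the `yaim -c -s ... -n` form, and the site-info.def path and glite-WN node type are assumptions, since the slide does not spell out the final configuration steps.

```python
# Hypothetical post-kickstart finishing script for a gLite 3 node:
# update packages with yum, then configure middleware with yaim.
# Paths, node type and yaim syntax are assumptions for illustration;
# the exact invocation varies between gLite releases.
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["yum", "-y", "update"])      # bring the node up to date

run(["/opt/glite/yaim/bin/yaim",  # assumed install location
     "-c",                        # configure
     "-s", "/root/site-info.def", # assumed site config path
     "-n", "glite-WN"])           # assumed node type (worker node)

# ...followed by the site-specific steps the slide alludes to.
```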

  8. On-campus clusters
• Existing eScience cluster with a batch system of 38 dual nodes (nearly 3 years old)
• New BEAR cluster being installed this week (May 2007): initial phase of 256 nodes, each with two dual-core processors (1024 cores in all; checked below), and 60 TB of data storage
• By the final phase: 2048 cores and 100 TB of data storage
• A share for us; a plan for GridPP too
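The quoted core counts follow directly from the node configuration; a one-line check (the 2x ratio between phases is just the slide's numbers restated):

```python
# Core-count arithmetic for the BEAR cluster, from the slide's figures.
nodes, sockets_per_node, cores_per_socket = 256, 2, 2
initial_cores = nodes * sockets_per_node * cores_per_socket
print(initial_cores)         # 1024 cores in the initial phase
print(2048 / initial_cores)  # final phase (2048 cores) is 2.0x the initial
```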
