Southgrid Technical Meeting

Presentation Transcript


  1. Southgrid Technical Meeting
     Pete Gronbech: 16th March 2006, Birmingham

  2. Present
     • Pete Gronbech – Oxford
     • Rosario Esposito – Oxford
     • Chris Brew – RAL PPD
     • Yves Coppens – Birmingham
     • Winnie Lacesso – Bristol

  3. Agenda
     • 10:30 Start – Pete + others
     • 12pm Lunch
     • Interactive workshop!!
     • 3:15pm Coffee ??
     • 4:20pm Finish

  4. Agenda Topics
     • LCG-2_7_0
       • Experiences from Yves & Chris
       • Plans for Oxford and Cambridge
     • Monitoring
       • Network monitoring box
       • Ganglia at Bristol once the webserver is ready
       • Ganglia mods for VOs – help again from Chris?
       • Nagios anyone?
       • Pakiti
       • AIDE
       • tripwire??
       • swatch / ranger
     • SC4
       • T2 workshop – who is going?
       • Throughput tests – Bristol and Cambridge to repeat tests at the end of the month
       • Network connectivity at Bristol / Cambridge at 1 Gbps?? Next week Bristol, Cam??
       • UI FTS client works out of the box (a usage sketch follows this slide)
     • Storage security challenge?? Do we know which logs to look at, or are the SRMs even doing enough logging?
     • Re the security challenge: a best-practice how-to should now be made available on the wiki….
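  A minimal sketch of driving the FTS from a UI, as referred to in the agenda item above. The gLite transfer commands are the standard client of that era, but the FTS endpoint URL and SURLs shown here are illustrative placeholders, not SouthGrid values:

      # Submit a transfer between two SRMs via the FTS (placeholder endpoint and SURLs)
      glite-transfer-submit -s https://fts.example.ac.uk:8443/glite-data-transfer-fts/services/FileTransfer \
          srm://source-se.example.ac.uk/dpm/example.ac.uk/home/dteam/throughput-test-file \
          srm://dest-se.example.ac.uk/dpm/example.ac.uk/home/dteam/throughput-test-file

      # The submit command prints a job ID; poll it until the transfer finishes
      glite-transfer-status -s https://fts.example.ac.uk:8443/glite-data-transfer-fts/services/FileTransfer <job-id>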

  5. Agenda Topics
     • ALICE paying for a machine to act as a VO box at Birmingham, possibly also at other SouthGrid sites. Security: root access policy?? Yves will test first …
     • Unified naming scheme? UKI-SOUTHGRID-OXF?? Oxford to try it!!
     • Cambridge progress with APEL / Condor?
     • Future upgrades
     • VO support: can all SouthGrid sites support the same VOs? With LCG 2_7_0 the VO tool is available.
     • On clusters with varying memory allocations, the memory available must be advertised per job slot, not per machine!!
     • Backups (a sketch follows this slide):
       • DPM database on the SE
       • CE
       • SE
       • MON
       • LFC
     • A central logging machine is also useful as a secondary backup of logs
     • VRVS demo
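  A minimal sketch of the DPM database backup listed above, assuming the default MySQL-backed DPM/DPNS schemas (cns_db and dpm_db); the credentials, paths and backup host are placeholders:

      # On the SE head node: dump the DPNS and DPM MySQL databases (placeholder credentials/paths)
      mysqldump --user=root --password --databases cns_db dpm_db > /backup/dpm-$(date +%Y%m%d).sql

      # Copy the dump off the SE, e.g. to the central logging/backup machine mentioned above
      scp /backup/dpm-$(date +%Y%m%d).sql backuphost.example.ac.uk:/backups/se/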

  6. LCG 2_7_0
     • Birmingham: SL 3.0.4 problems with mktemp, so upgrading to SL 3.0.5 first helps.
     • Check the SL mirror is still OK.
     • Birmingham: other problems – info provider on the CE (extra info from Maui only worked for the default Maui setup, as we have a customised config).
     • The recent (March 7th) Bristol upgrade was much smoother; many bugs fixed.
     • SouthGrid now using more modules, e.g. for Ganglia.
     • For new nodes use pbsnodes -o fqdn, then when happy use pbsnodes -c fqdn (a sketch follows this slide).
     • Plan for Oxford next week and Cambridge shortly after.
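  A short sketch of the node-commissioning step above, using the Torque/PBS pbsnodes commands already quoted; the worker-node name is a placeholder:

      # Mark a newly installed worker node offline so the scheduler sends it no jobs while testing
      pbsnodes -o wn99.example.ac.uk

      # List nodes currently down or offline to confirm the state change
      pbsnodes -l

      # Once happy with the node, clear the offline flag so it returns to production
      pbsnodes -c wn99.example.ac.uk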

  7. SL 3 – 4 – 5
     • Summary of a talk given at CERN with respect to SL versions.
     • SL5 will be too late for the LHC, so the push is to certify SL4 by the end of March and migrate in Autumn 2006.
     • There is no OS upgrade planned for 2007!

  8. Future Upgrades
     • RAL PPD (£250K)
       • 52 dual-core, dual-Opteron worker nodes (late March 06)
       • 7 × 8 TB SATA disk servers (2 system, 1 parity, 1 hot spare, 20 data disks)
       • Network: separate 1 Gb/s fibre, separate from the rest of PPD. Nortel switches (8), 10 Gb capable, so should be able to have a 10 Gb link to the RAL T1.
     • CPU over the MoU can be used for a T3 local PPD VO.

  9. Future Upgrades
     • Oxford
       • Still waiting for our new computer room….. now Autumn 06.
       • Short-term air-con upgrade to allow us to stay as we are!
     • Bristol
       • Uni cluster to go on the 5th floor of Physics.
       • New room?? June?? Seems unrealistic.
       • Need to know how to send LCG jobs to external clusters – we should ask LT2 how they did the e-Science centre.

  10. Future Upgrades
     • Birmingham
       • ATLAS farm may get integrated more, but…
       • BaBar farm has increased its node count but has h/w reliability problems. Some new disks purchased, and may buy some replacement PSUs.
       • BaBar CE to be migrated to the GridPP UI box.
       • Lawrie applying for esci clusters….
