
southgrid.ac.uk

Presentation Transcript


www.southgrid.ac.uk
University of Oxford, Particle Physics Grid involvement 2000-2008

Early beginnings at Oxford. Following attendance at the RAL course given by Ian Foster's Globus team on 21-23 June 2000, initial installations on grid test systems used Globus 1.1.3 (August-October 2000). Various old desktop systems formed the early testbed.

The first production-quality hardware was purchased in 2004: two racks of Dell servers providing 80 2.8 GHz Xeon CPUs (approximately 65 KSI2000) with 3.2 TB of storage. In addition, a 30-CPU cluster for the CDF SAM Grid, with ~9 TB of storage, was also provided.

Lack of electrical power and, more importantly, of sufficient air conditioning prevented planned upgrades in 2005 and 2006. Plans were put in place to have a large computer room, shared with the Oxford Supercomputer, built at Begbroke Science Park, but that large project was not scheduled to be completed until late 2007. Local Physics infrastructure for Windows systems and departmental clusters was also in great need of an updated room: existing systems housed in converted offices were at their limits, office space is at a premium, and other systems were in overheated and dirty areas.

So in 2007 approval was given for a local Physics infrastructure computer room to be built on level 1 of the Denys Wilkinson building. In April 2007 an area was cleared on level 1, and in June the first real work of dry-lining the walls and installing the air-conditioning units started.

The room has a 500 mm deep false floor, which uses heavy-duty anti-static floor tiles. Cold air from the AC units passes up through ventilated tiles in front of the racks; the racks are arranged in a hot-aisle/cold-aisle configuration, with hot air from the rear of the racks returning by convection to the top of the AC units. Each of the 21 rack positions is supplied with two 32 A Commando sockets for power and four CAT6 network sockets.

Two 100 kW power distribution panels supply the sockets under the racks, with each of the three rows of seven racks on a different phase; a third panel supplies the AC units (a rough capacity check is sketched after the transcript). The external chillers are seen below.

The room was ready by 10th September, and the upgrade to the Tier 2 Grid cluster was installed during the following week. The new worker nodes are a high-density design that houses two servers in a 1U chassis; the shared power supply is more electrically efficient than two separate units, giving a lower cost of ownership. The storage servers each provide 9 TB of usable storage (after RAID 6). This substantial upgrade gives a six-fold increase in CPU power and a roughly 34-fold increase in storage (431 KSI2000 and 100 TB of storage). The complete cluster, old and new racks, is shown below.

SouthGrid Management Board. Chair: Jeff Tseng. Technical Co-ordinator: Pete Gronbech. September 2007.
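The power and upgrade figures quoted above can be sanity-checked with simple arithmetic. The Python sketch below is not part of the presentation; it assumes a nominal 230 V single-phase supply on each 32 A Commando socket (an assumption, not stated in the slides) and compares the nominal socket capacity with the 2 x 100 kW panel capacity, then computes the CPU and storage scaling factors from the quoted before-and-after numbers.

```python
# Back-of-the-envelope check of the figures in the transcript above.
# Assumption (not in the slides): 230 V single-phase supply per socket,
# nominal (undiversified) loading.

SOCKET_CURRENT_A = 32      # per Commando socket
SUPPLY_VOLTAGE_V = 230     # assumed UK single-phase voltage
SOCKETS_PER_RACK = 2
RACK_POSITIONS = 21
PANEL_CAPACITY_KW = 100    # per power distribution panel
RACK_PANELS = 2            # two panels feed the rack sockets

# Nominal socket capacity per rack and for the whole room
rack_kw = SOCKET_CURRENT_A * SUPPLY_VOLTAGE_V * SOCKETS_PER_RACK / 1000
room_socket_kw = rack_kw * RACK_POSITIONS
panel_kw = PANEL_CAPACITY_KW * RACK_PANELS

print(f"Nominal per-rack socket capacity: {rack_kw:.1f} kW")   # ~14.7 kW
print(f"Nominal room socket capacity:     {room_socket_kw:.0f} kW")
print(f"Panel capacity feeding the racks: {panel_kw} kW")
# The sockets could nominally draw more than 200 kW, so the two
# distribution panels, not the sockets, set the practical power budget.

# Scaling factors quoted for the 2007 Tier 2 upgrade
old_cpu_ksi2000, new_cpu_ksi2000 = 65, 431
old_storage_tb, new_storage_tb = 3.2, 100
print(f"CPU increase:     ~{new_cpu_ksi2000 / old_cpu_ksi2000:.1f}x")
print(f"Storage increase: ~{new_storage_tb / old_storage_tb:.0f}x")
# Compare with the quoted six-fold CPU and roughly 34-fold storage figures.
```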
