Computer Room Requirements for High Density Rack Mounted Servers

Rhys Newman, Oxford University

Outline
  • Why do we need computer rooms?
    • Why in the past.
    • Why in the future.
  • Design of the environment.
    • Cooling
    • Humidity
    • Power
  • Proposal at Oxford Physics
  • Conclusion
Why do we need them (Past)
  • Security
    • Equipment is valuable.
  • Convenience
    • Specialist knowledge was needed to look after them.
    • Networking was relatively difficult.
  • Bulk
    • A single (useful) installation was large.
Why we need them (Future)
  • Specialist Environmental Requirements
    • Higher density implies greater environmental sensitivity.
  • Convenience
    • Human time cost of software maintenance.

Computer rooms will still be needed for the immediate future, but the Grid will reduce the need in the long term.

Cooling - Then
  • Rack mounting was designed for high CPU density – optimising space usage, given the effort needed to allocate a secure facility.
    • Until recently, maximum power usage was about 2-3 kW per rack.
    • Air cooling sufficient, cool air taken directly from under the floor.
    • Even conventional air conditioning on the ceiling was often enough.
Cooling Now: Too Much Success!
  • Modern 1U servers are 300 W heaters => 12 kW per rack (18 kW for blade servers).
  • Rule of thumb: 1000 litres/sec of cool air can remove 12 kW.
    • In detail, a Dell 1750 uses 1200 l/min.
  • For 40 racks this is 32,000 l/sec, which in a typical 600 mm duct is a wind speed of 320 km/h!
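The airflow arithmetic on this slide can be checked with a short script. Assumptions (not stated explicitly above): 40 1U servers per rack, the Dell 1750 airflow figure of 1200 l/min, and a square 600 mm x 600 mm duct cross-section.

```python
# Sanity-check the slide's airflow figures.
LITRES_PER_MIN_PER_SERVER = 1200   # Dell 1750 figure from the slide
SERVERS_PER_RACK = 40              # assumed: 40 x 1U in a standard rack
RACKS = 40

per_server_ls = LITRES_PER_MIN_PER_SERVER / 60           # 20 l/s per server
room_ls = per_server_ls * SERVERS_PER_RACK * RACKS       # total for the room

duct_area_m2 = 0.6 * 0.6                                 # 600 mm square duct
speed_ms = (room_ls / 1000) / duct_area_m2               # l/s -> m^3/s, then m/s
speed_kmh = speed_ms * 3.6

print(f"{room_ls:.0f} l/s, {speed_kmh:.0f} km/h")        # 32000 l/s, 320 km/h
```

Both headline numbers on the slide (32,000 l/s and 320 km/h) drop straight out of the per-server figure.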
Cooling - Solutions
  • Focus on airflow!
    • Place racks in rows – hot aisle, cold aisle.
    • Leave doors off the racks.
    • Identify hotspots statically, or dynamically (HP smart cooling).
  • Rule of thumb: air cooling can manage about 1200 W/m².
Major Problem – no bang for buck
  • As processor speeds increase =>
  • They get hotter =>
  • Fewer fit per square metre =>
  • Overall CPU capacity in the datacentre goes DOWN.

All this irrespective of how well you design the air cooling systems!
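To see why density stalls, turn the air-cooling rule of thumb into floor space: at ~1200 W/m², a 12 kW rack must effectively claim about 10 m² of room, far more than its physical footprint. A minimal sketch using only the two figures quoted above:

```python
RACK_POWER_W = 12_000        # modern 1U rack, from the earlier slide
AIR_COOLING_W_PER_M2 = 1200  # rule of thumb quoted above
RACKS = 40

area_per_rack_m2 = RACK_POWER_W / AIR_COOLING_W_PER_M2   # effective m^2 per rack
room_m2 = area_per_rack_m2 * RACKS                       # floor area for 40 racks

print(f"{area_per_rack_m2:.0f} m^2 per rack, {room_m2:.0f} m^2 total")
# 10 m^2 per rack, 400 m^2 total
```

Doubling per-rack power without improving cooling simply doubles the floor area each rack must own, so total compute per room does not rise.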

Cooling Solution II
  • Try self contained systems.
  • Try water cooled units (self contained or otherwise).
  • Use “smarter” systems which actively manage hotspots; HP smart cooling claims up to 2.5 kW/m² in this way (??).
  • Computers (in a datacentre) have tighter humidity tolerances than humans – 45%-55% relative humidity (despite manufacturer limits of 8%-80%).
    • Too low risks static electricity (the fans in the computers themselves generate it).
    • Too high risks localised condensation, corrosion and electrical shorts. Note: zinc in floor tiles!
  • Air-conditioning units must be better than those used in normal offices – how many computer rooms rely on conventional units?

There is no magic bullet of simply importing external air and venting it to the outside!

  • All this heat comes in through the power supply:
    • ~1.2 A per server (at ~240 V)
    • ~50 A per rack
    • ~2000 A for a 40-rack centre
  • Adding the cooling systems takes the total to roughly 4000-5000 A, i.e. around 1-1.25 MW.
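The current figures can be re-derived from the per-server numbers. The sketch below assumes a ~240 V single-phase supply and a cooling plant drawing roughly as much again as the IT load (a power usage effectiveness near 2, typical of the period) – both are assumptions, not figures from the slides.

```python
SERVER_W = 300           # per 1U server, from the cooling slide
VOLTS = 240              # assumed supply voltage
SERVERS_PER_RACK = 40    # assumed rack population
RACKS = 40
COOLING_OVERHEAD = 1.0   # assumed: cooling draws as much again as the IT load

amps_per_server = SERVER_W / VOLTS                   # ~1.25 A
amps_per_rack = amps_per_server * SERVERS_PER_RACK   # ~50 A
it_amps = amps_per_rack * RACKS                      # IT load for 40 racks
total_amps = it_amps * (1 + COOLING_OVERHEAD)        # IT + cooling
total_mw = total_amps * VOLTS / 1e6

print(f"{it_amps:.0f} A IT, {total_amps:.0f} A total, {total_mw:.2f} MW")
# 2000 A IT, 4000 A total, 0.96 MW
```

Under these assumptions the facility lands at roughly a megawatt, consistent with the order of magnitude quoted above.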
Summary so far…
  • Modern machines need a well designed physical environment to get the most out of them, and most existing facilities are no longer well suited – a recent development.
    • Intel scrapped two chip lines to concentrate on lower-power chips rather than simply faster ones.
    • Sun (and others) are working on chips with multiple cores and lower clock speeds (good for internet servers, not so good for physics!).
  • The cost of the surrounding room is a substantial fraction of the cost of the entire facility.
Example: 40 Racks for Oxford
  • We have an ideal location
    • Lots of power
    • Underground (no heat from the sun, and very secure).
    • Lots of headroom (false floor/ceiling for cooling systems)
    • Basement
      • No floor-loading limit.
      • Does not use up office space.
Bottom Line
  • The very basic estimate for the room, given the shell, is £80k.
  • Adding fully loaded cooling, UPS, power conditioning, fire protection etc. will probably take this to £400k over time.
  • Cost of 40 loaded racks: ~£1.6 million.
  • Infrastructure costs: about 25% of the setup cost and up to 50% of the running costs.
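The setup ratio follows directly from the two figures above: £400k of infrastructure against £1.6M of racks is 25% of the hardware cost (or 20% of the combined total, depending on how "setup" is read).

```python
room_gbp = 400_000      # fully fitted room: cooling, UPS, fire protection
racks_gbp = 1_600_000   # 40 loaded racks

infra_vs_hardware = room_gbp / racks_gbp                   # vs hardware alone
infra_share_of_total = room_gbp / (room_gbp + racks_gbp)   # vs combined total

print(infra_vs_hardware, infra_share_of_total)  # 0.25 0.2
```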
Hang on!
  • There are about 50,000 computers already in Oxford University alone.
  • Assume 20,000 of them are usable.
  • Together they already amount to a major data centre, with essentially no infrastructure problems!
  • The problem is software – the Grid will exploit these resources and thereby save millions in datacentre costs over the medium term!
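To give a rough sense of scale, assume – purely for illustration, this ratio is not from the slides – that an average desktop delivers about one third of the throughput of a dedicated server node:

```python
DESKTOPS = 20_000          # usable machines, from the slide
SERVER_NODES = 40 * 40     # the proposed room: 40 racks of 40 x 1U servers
DESKTOP_FRACTION = 1 / 3   # assumed throughput of a desktop vs a server node

server_equivalents = DESKTOPS * DESKTOP_FRACTION
print(f"{server_equivalents / SERVER_NODES:.1f}x the proposed room")  # 4.2x the proposed room
```

Even under a conservative throughput assumption, the existing desktops dwarf the proposed 40-rack facility, which is the point of the slide.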
Thank you!
  • Sun has a detailed paper at:
  • APC has a number of useful white papers: