


Computer Room Requirements for High Density Rack Mounted Servers

Rhys Newman

Oxford University



Outline

  • Why do we need computer rooms?

    • Why in the past.

    • Why in the future.

  • Design of the environment.

    • Cooling

    • Humidity

    • Power

  • Proposal at Oxford Physics

  • Conclusion



Why do we need them (Past)

  • Security

    • Equipment is valuable.

  • Convenience

    • Specialist Knowledge is needed to look after them.

    • Networking was relatively difficult.

  • Bulk

    • A single (useful) installation was large



Why do we need them (Future)

  • Specialist Environmental Requirements

    • High density implies more sensitive equipment.

  • Convenience

    • Human time cost of software maintenance.

Computer rooms will be needed for the immediate future, but the Grid will reduce the need in the long term.



Cooling - Then

  • Rack mounting was designed to achieve high CPU density – optimising space usage, given the effort needed to provide a secure facility.

    • Until recently, maximum power usage was about 2–3 kW per rack.

    • Air cooling was sufficient, with cool air taken directly from under the floor.

    • Even conventional ceiling-mounted air conditioning was often enough.



Cooling Now: Too Much Success!

  • Modern 1U servers are 300 W heaters, so a rack of 40 is a 12 kW heater (18 kW for blade servers).

  • Rule of thumb: 1000 litres/sec of cool air can handle 12 kW.

    • In detail, a Dell PowerEdge 1750 uses 1200 l/min (20 l/s).

  • For 40 racks this is 32,000 l/s (20 l/s × 40 servers × 40 racks), which in a typical 600 mm duct is a wind speed of 320 km/h!
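
A quick back-of-the-envelope check of this arithmetic, as a minimal Python sketch (treating the "600 mm duct" as a square 0.6 m × 0.6 m cross-section is an assumption – the slide does not specify the duct shape):

    # Airflow check for 40 racks of 40 1U servers.
    SERVER_AIRFLOW_LPS = 1200 / 60   # Dell 1750: 1200 l/min -> 20 l/s per server
    SERVERS_PER_RACK = 40            # 1U servers in a standard rack
    RACKS = 40

    total_lps = SERVER_AIRFLOW_LPS * SERVERS_PER_RACK * RACKS  # 32,000 l/s
    total_m3s = total_lps / 1000                               # 32 m^3/s

    duct_area_m2 = 0.6 * 0.6          # assumed square 600 mm duct: 0.36 m^2
    velocity_ms = total_m3s / duct_area_m2
    velocity_kmh = velocity_ms * 3.6  # ~89 m/s, i.e. roughly 320 km/h

    print(f"Total airflow: {total_lps:,.0f} l/s")
    print(f"Duct wind speed: {velocity_kmh:.0f} km/h")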



Cooling - Solutions

  • Focus on airflow!

    • Place racks in rows – hot aisle, cold aisle.

    • Leave doors off the racks.

    • Identify hotspots statically or dynamically (HP smart cooling).

  • Rule of thumb: air cooling can manage 1200 W/m².
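
As an illustration of what this rule of thumb implies, here is a short Python calculation using the 12 kW rack figure from the earlier slide (a sketch, not from the presentation itself):

    # Floor area needed to air-cool one rack under the 1200 W/m^2 rule of thumb.
    RACK_POWER_W = 12_000            # a full rack of 1U servers (earlier slide)
    AIR_COOLING_W_PER_M2 = 1200      # rule-of-thumb limit for air cooling

    area_per_rack_m2 = RACK_POWER_W / AIR_COOLING_W_PER_M2
    print(f"Floor area per rack: {area_per_rack_m2:.0f} m^2")  # 10 m^2

At 10 m² per rack, the racks themselves occupy only a small fraction of the floor they need for cooling, which is exactly the problem the next slide describes.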



Major Problem – no bang for buck

  • As processor speeds increase =>

  • They run hotter =>

  • Fewer can fit per square metre =>

  • Overall CPU power in the datacentre goes DOWN.

All this is irrespective of how well you design the air cooling system!



Cooling Solution II

  • Try self contained systems.

  • Try water cooled units (self contained or otherwise).

  • Use “smarter” systems which actively manage hotspots – HP smart cooling claims up to 2.5 kW/m² this way (unverified).



Humidity

  • Computers (in a datacentre) have tighter humidity tolerances than humans – 45%–55% relative humidity (despite manufacturer limits of 8%–80%).

    • Too low risks static electricity (the fans in the computers themselves generate it).

    • Too high risks localised condensation, corrosion and electrical shorts. Note: zinc whiskers from floor tiles!

  • Air conditioning units must be better than those for normal offices – how many computer rooms use conventional units?

There is no magic bullet of simply importing external air and venting it to the outside!
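
To see why high humidity risks localised condensation, here is a small Python sketch using the standard Magnus dew-point approximation (the 22 °C room temperature and the 80% comparison point are illustrative assumptions, not figures from the slide):

    import math

    # Condensation forms on any surface colder than the dew point of the air.
    def dew_point_c(temp_c: float, rh_percent: float) -> float:
        a, b = 17.27, 237.7  # standard Magnus constants
        gamma = math.log(rh_percent / 100) + a * temp_c / (b + temp_c)
        return b * gamma / (a - gamma)

    # Assumed 22 C room air, at the top of the tolerance band and above it.
    for rh in (55, 80):
        print(f"RH {rh}%: dew point {dew_point_c(22.0, rh):.1f} C")

    # At 80% RH the dew point is ~18 C, so any chilled surface only a few
    # degrees below room temperature will already collect condensation.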



Power

  • All this heat comes from the power supply:

    • 1.2 A per server

    • 50 A per rack

    • 4000 A for a 40-rack centre

  • And including the cooling systems, a total of 5000 A => 1.25 MW.
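
A rough sanity check of these figures in Python (the ~240 V supply voltage is an assumption, as is reading the quoted site totals as including headroom over the raw sums – the slide states neither):

    # Power sanity check for the 40-rack centre.
    VOLTS = 240                       # assumed UK supply voltage
    AMPS_PER_SERVER = 1.2             # ~290 W per server, matching the 300 W figure
    SERVERS_PER_RACK = 40
    RACKS = 40

    amps_per_rack = AMPS_PER_SERVER * SERVERS_PER_RACK   # ~48 A (slide: 50 A)
    raw_site_amps = amps_per_rack * RACKS                # ~1,900 A raw; the slide's
                                                         # 4000 A implies roughly
                                                         # 2x headroom (assumption)
    total_amps = 5000                                    # servers + cooling (slide)
    total_mw = total_amps * VOLTS / 1e6                  # ~1.2 MW (slide: 1.25 MW)

    print(f"Per rack: {amps_per_rack:.0f} A")
    print(f"Site load: {total_mw:.2f} MW")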



Summary so far…

  • Modern machines need a well designed physical environment to get the most out of them, and most current facilities are no longer well suited – a recent development.

    • Intel scrapped two chip lines to concentrate on lower power chips, rather than simply faster ones.

    • Sun (and others) are working on chips with multiple cores and lower clock speeds (good for internet servers, not so good for physics!).

  • The surrounding room represents a substantial part of the cost of the entire facility.



Example: 40 Racks for Oxford

  • We have an ideal location

    • Lots of power

    • Underground (no heat from the sun and very secure).

    • Lots of headroom (false floor/ceiling for cooling systems)

    • Basement

      • No floor loading limit.

      • Does not use up office space.



Bottom Line

  • The very basic estimate for the room, given the shell, is £80k.

  • Adding fully loaded cooling, UPS, power conditioning, fire protection, etc. will probably take this to £400k over time.

  • Cost of 40 racks ~ £1.6 million

  • Infrastructure costs: 25% of setup cost (£400k against £1.6M for the racks) and up to 50% of running costs.



Hang on!

  • There are about 50,000 computers already in Oxford University alone.

  • Assume 20,000 of these are usable.

  • Together these already form a major data centre, with essentially no infrastructure problems!

  • The problem is software – the Grid will exploit these resources and thereby save millions in datacentre costs in the medium term!



Thank you!

  • Sun has a detailed paper at: http://www.sun.com/servers/white-papers/dc-planning-guide.pdf

  • APC has a number of useful white papers: http://www.apc.com/tools/mytools/

