One Small Step Toward a Green Datacenter

Presentation Transcript


  1. One Small Step Toward a Green Datacenter

    Quan Nguyen <quan.nguyen@mcgill.ca> Network and Communications Services McGill University CANHEIT | On the EDGE | June 15-18, 2008 | University of Calgary
  2. Today’s Talk
    Power consumption trends in the datacenter
    Our cooling problem
    Our solution
    The result
    Lessons learned:
      Take time to understand air flow and heat extraction
      Find one design and stick to it
      Ad-hoc fixes can adversely affect the solution
  3. Power consumption trends in the datacenter
    Jonathan Koomey[1] (LBNL and Stanford):
      Power consumption by servers in the US grows by about 14% each year
      In 2005, datacenters in the US burned about 45 billion kWh, 1.2% of 2005 retail electricity sales, or about $2.7 billion
      Worldwide electricity use for volume servers (OEM) estimated at 1.7%
    Every kWh of electricity used for processing IT loads translates into roughly another kWh of electricity for the supporting physical infrastructure (UPSes, PDUs, HVAC equipment)
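    The last point above is essentially a power usage effectiveness (PUE) of about 2: every kWh delivered to the IT load brings roughly another kWh of cooling and power-distribution overhead. A minimal sketch of that arithmetic in Python; the IT load and electricity rate are illustrative assumptions, not figures from the talk:

      # Rough annual-energy arithmetic for a PUE of ~2 (1 kWh of IT load plus
      # ~1 kWh of UPS/PDU/HVAC overhead). All inputs are assumed, for illustration.
      IT_LOAD_KW = 70          # assumed average IT load in kW
      PUE = 2.0                # total facility energy / IT energy
      RATE_PER_KWH = 0.06      # assumed electricity price in $/kWh
      HOURS_PER_YEAR = 8760

      it_kwh = IT_LOAD_KW * HOURS_PER_YEAR        # energy consumed by the servers
      total_kwh = it_kwh * PUE                    # energy including infrastructure
      print(f"IT load:  {it_kwh:,.0f} kWh/year")
      print(f"Facility: {total_kwh:,.0f} kWh/year (~${total_kwh * RATE_PER_KWH:,.0f}/year)")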
  4. [Nick Carr][2]: An avatar (an online representation of a human) in Linden Labs’ virtual-reality online game Second Life consumes 1,752 kWh per year (about $100/year of electricity), as much power as the average Brazilian uses over the course of one year
  5. More trends on power in datacenters
    2007 Gartner report[3]: datacenters are responsible for 0.5% of global CO2 emissions. The whole IT and telecom industry accounts for 2%, equivalent to that of the aviation industry
    A medium-sized server has roughly the same carbon footprint as an SUV doing 15 mpg [4]
    AFCOM (Association for Computer Operation Managers) and InterUnity Group 2005 survey: 48% of respondents felt they would be incapable of supporting business requirements due to capacity constraints
    Growth in demand for new applications will continue to outstrip the processing-power increases that flow from Moore’s Law
  6. Our Cooling Problem
    Many hot spots (not Wi-Fi hot spots!). Servers near the top are not cooled sufficiently
    Average room temperature rose from 72 F to 77 F when one specific cabinet was put in service
    Frequent CRAC (computer room air conditioner) failures
    Our cold aisles are hot, and our hot aisles are hot too
  7. (Diagram of the datacenter: alternating hot and cold aisles, airflow direction, CRAC and UPS units, hot-air and cool-air paths)
  8. Side view of the datacenter (diagram: concrete floor, raised floor with cool air underneath, false ceiling collecting hot air, CRAC and UPS units)
  9. We soon realized…
    Recommendations [6] from a reputable engineering firm that designs our building HVAC systems amounted to a brute-force approach: add another CRAC unit
    But:
      It will cost a lot (and a lot) to add more cooling capacity
      We still don’t know how to solve the hot-spot problem
      We still don’t know how to properly set up hot aisle/cold aisle
      Not more efficient than the current system
      CRACs would still fight each other for humidity control
  10. We asked for a second opinion
    We talked to a firm[7] specializing in datacenter build-outs and soon learned:
      We can make our CRAC units work more efficiently instead of taking the brute-force approach of adding more cooling power (costs much less)
      There are some things that technology just can’t solve. We need to update our rack and server placement and our ad-hoc installation methodology
      We need a technology to support the new higher-density racks. CRACs just can’t cool these
  11. (image-only slide)
  12. A detailed analysis of our problem
    Our CRAC units are supposed to be able to adequately cool our existing heat load. They simply weren’t working efficiently
    They were designed for low-power-density racks and can’t keep up with higher-density servers
    They were designed to cool the room, not necessarily the hot spots
  13. … detailed analysis of our problem
    Heat dissipated by the servers could not be drawn completely back into the CRACs
    Non-captured heat got stuck just below the ceiling, creating a “heat fog” phenomenon
    Non-captured heat got mixed back with the cold air, making the situation worse
  14. … detailed analysis of our problem
    Air is not cold enough by the time it gets to the top of the rack
    Our CRAC units are working too hard:
      8% more current drawn than normal
      Increased rate of failures
      More refrigerant/compressor seizures
  15. … detailed analysis of our problem
    Some equipment is facing the wrong direction, blowing hot air into the wrong place
    Too much heat turbulence
    Air is still very cold under the floor, an indication that we still have enough cooling capacity
    Too many or not enough perforated tiles? (More on this later)
  16. So what is the ideal situation?
    CRAC units should work as the specs say, within their optimal curves. Intake temperature should be as high as possible
    The new concept is to remove heat, not to pump cold
    Cold aisles stay cold, hot aisles stay hot
    Servers at the top of cabinets receive adequate cooling
    Which means:
      No “heat fog”
      CRAC units must draw less current
      No heat turbulence
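    Why a high intake (return) temperature matters: at a fixed airflow, the sensible heat a CRAC removes is proportional to the difference between return and supply air temperatures, roughly Q[BTU/hr] ≈ 1.08 × CFM × ΔT(°F). A small sketch with assumed numbers (not measurements from our room):

      # Sensible cooling at a fixed airflow scales with (return - supply) temperature.
      # CFM and temperatures below are illustrative assumptions only.
      def sensible_capacity_kw(cfm, return_f, supply_f):
          btu_per_hr = 1.08 * cfm * (return_f - supply_f)   # standard-air approximation
          return btu_per_hr / 3412.0                        # 1 kW ~= 3,412 BTU/hr

      CFM = 12000       # assumed airflow of one CRAC unit
      SUPPLY_F = 60     # assumed supply (cold aisle) temperature

      for return_f in (70, 75, 80):
          kw = sensible_capacity_kw(CFM, return_f, SUPPLY_F)
          print(f"return air {return_f} F -> about {kw:.0f} kW of heat removed")

    Keeping hot and cold air separated is what pushes the return temperature up without raising the cold-aisle temperature.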
  17. Too many or not enough perforated tiles?
    The big engineering firm said that we didn’t have enough
    Based on the theoretical limit of our CRAC units, they are supplying more CFM (cubic feet per minute) than the existing perforated tiles allow
    So we must add more perforated tiles
    It didn’t help! … more in the next slides
  18. Why didn’t adding more perforated tiles help?
    There was not enough static pressure under the raised floor. We were able to fix the problem and raise the static pressure by 38% after the work described later
    The placement of the perforated tiles and their type (50% or 25% perforation) is important. Not all racks have the same heat load. Add those tiles only in the higher-density rack areas
  19. Why didn’t adding more perforated tiles help?
    We were already poking too many holes in the floor, mostly under the racks for cable runs. It was equivalent to having 25 floor tiles permanently removed
    The cold air was released into the hot aisles, creating more heat turbulence and starving the racks of cool air
    Cable runs under the raised floor block the air circulation in some areas
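    Two back-of-envelope checks behind these two slides (the tile size is an assumption, and the orifice-flow relation is the usual approximation, not a site measurement): airflow through a perforated tile scales roughly with the square root of the under-floor static pressure, and the quoted “25 tiles permanently removed” matches about 100 sq ft of cable openings if the tiles are standard 2 ft × 2 ft.

      import math

      # (1) Orifice-flow approximation: tile airflow ~ sqrt(static pressure), so the
      #     +38% pressure gain alone is worth roughly +17% airflow through each tile.
      pressure_ratio = 1.38
      print(f"per-tile airflow gain ~ {math.sqrt(pressure_ratio) - 1:.0%}")

      # (2) "25 tiles permanently removed": ~100 sq ft of cable openings (slide 22)
      #     divided by an assumed standard 2 ft x 2 ft tile.
      openings_sq_ft = 100
      tile_sq_ft = 2 * 2
      print(f"{openings_sq_ft / tile_sq_ft:.0f} tile-equivalents of open floor")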
  20. The fix
    A series of coordinated efforts recommended by the specialized firm
    We did spend a bit extra to measure the static pressure and efficiency of our CRAC units before and after the work
    All the gains were obtained as expected
  21. 1 – Clean out the sub-floor
    Remove all unused cables (from the mainframe days)
    Tidy up cable runs (or rerun some)
    Make sure cable paths run parallel to the airflow from the CRAC units
  22. 2 – Seal the floor
    Seal almost 100 sq ft (260 floor openings)
    KoldLok® brush system
  23. 2 – Seal the floor
    Plenaform adhesive sheets
  24. 3 – Use the false ceiling for hot air return
    3A – Install ductwork onto the return-air intakes of the CRAC units so that 100% of the return air is drawn from the false-ceiling plenum
    3B – Install “egg crate” type return-air grills in the false ceiling and position them strategically (mostly in hot aisles and hot spots)
  25. 3 – Use the false ceiling for hot air return
  26. 3 – Use the false ceiling for hot air return (diagram: side view as before, now with new ductwork connecting the false-ceiling plenum to the CRAC return)
  27. 4 – Prevent heat leakage between rows
    Do not leave openings in the racks. Install blanking panels
    Do not leave openings between cabinets
    Do not have cabinets of varying heights in the same row
  28. From APC White Paper #42
  29. 5 – Reposition cabinets
    Avoid blowing hot air into a cold aisle or into the intake of other equipment (surprising that we had made that mistake)
    Make sure home-made fans are not mounted in reverse or blowing hot air into a cold aisle
    Avoid fans in any case: they stir up dust
  30. Measured results
    Static pressure increased by 38%
    Average room temperature unchanged, but the temperature of the hot-air intake into the CRAC units increased by 2.6%
    Total CFM of cold air coming from the floor increased by 23% (34,400 CFM → 42,500 CFM)
    Current drawn by the CRAC units dropped by 8%
  31. Estimated results
    The return-air temperature into the CRAC units increased, resulting in higher cooling capacity and efficiency. We have regained our CRAC efficiency
    We estimate that we have regained about 20 tons of cooling capacity (for about 70 kW of heat load)
    Total work cost $50K, versus $150K+ for adding equivalent cooling capacity
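    The “20 tons for about 70 kW” figure is just the standard refrigeration-ton conversion (1 ton = 12,000 BTU/hr, 1 kW ≈ 3,412 BTU/hr); a quick check of that and of the cost gap quoted above:

      # Unit check: refrigeration tons -> kW of heat removal.
      TONS = 20
      BTU_PER_HR_PER_TON = 12000
      BTU_PER_HR_PER_KW = 3412
      kw = TONS * BTU_PER_HR_PER_TON / BTU_PER_HR_PER_KW
      print(f"{TONS} tons ~= {kw:.0f} kW of heat removal")   # ~70 kW

      # Cost comparison as stated on the slide (figures in thousands of dollars).
      remediation_k, new_capacity_k = 50, 150
      print(f"Remediation ${remediation_k}K vs new cooling ${new_capacity_k}K+: "
            f"saving at least ${new_capacity_k - remediation_k}K")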
  32. In Summary
    Clean out the sub-floor
    Seal the floor
    Use the false ceiling for hot air return (install ducting from the CRAC into the ceiling)
    Prevent heat leakage between rows
    Reposition cabinets
  33. How to stay green
    Save energy (and money too) by running efficiently
    Carefully implement the hot-aisle/cold-aisle scheme
    Run the CRAC at its optimal performance curve and set the temperature and humidity at the right mid-point (74 F and 50% relative humidity)
    Run air conditioners and UPS units close to capacity (at least 65%) to be energy-efficient. Do not overbuy. Use modular systems
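    A minimal sketch of the “do not overbuy” point: fixed losses make lightly loaded UPS and CRAC units disproportionately inefficient, so it is worth checking each unit’s load factor against the roughly 65% target mentioned above. The capacities and loads below are made-up examples, not our inventory:

      # Flag units running below the ~65% load factor suggested on the slide.
      # All capacities and loads here are invented for illustration.
      TARGET_LOAD_FACTOR = 0.65

      units = {
          "UPS-A":  {"capacity_kw": 160, "load_kw": 70},
          "CRAC-1": {"capacity_kw": 88,  "load_kw": 62},
      }

      for name, u in units.items():
          load_factor = u["load_kw"] / u["capacity_kw"]
          status = ("OK" if load_factor >= TARGET_LOAD_FACTOR
                    else "underloaded: consider consolidating or a modular unit")
          print(f"{name}: load factor {load_factor:.0%} -> {status}")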
  34. End Notes
    [1] Estimating total power consumption by servers in the U.S. and the world
        http://enterprise.amd.com/Downloads/svrpwrusecompletefinal.pdf
        http://enterprise.amd.com/us-en/AMD-Business/Technology-Home/Power-Management.aspx
    [2] Avatars consume as much electricity as Brazilians
        http://www.roughtype.com/archives/2006/12/avatars_consume.php
    [3] KUMAR, Rakesh, and Lars MIERITZ. Conceptualizing ‘Green’ IT and Data Center Power and Cooling Issues, Gartner, 7 September 2007
    [4] An Inefficient Truth, Global Action Plan
        http://www.globalactionplan.org.uk/research.aspx
    [5] The green datacenter: Energy efficient computing in the 21st century
        http://searchdatacenter.techtarget.com/generic/0,295582,sid80_gci1273283,00.html
    [6] Project 05-19-022 – Infrastructure study Data Centre – Burnside Hall, McGill University (restricted), May 4, 2006
    [7] Report on the improvement of the Main Server Room conditions, Lot 1 – Air distribution, McGill University (restricted), September 4, 2007
  35. (image-only slide)