
Vette LiquiCool™ Solution


Presentation Transcript


  1. Vette LiquiCool™ Solution. Rob Perry, Executive Manager. Arlene Allen, University of California Santa Barbara, Director, Information Systems & Computing

  2. Data Center Trends - Staggering Energy Consumption and Cost of Energy • Energy unit price has increased an average of 4% YOY in the USA and 11% YOY globally • Data Center energy consumption is growing by 12% annually. Source: EPA 2007 Report to Congress
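
These two growth rates compound. As a rough illustration, the sketch below (Python) projects an annual energy bill when unit price rises ~4% per year and consumption grows ~12% per year; the $1M starting spend is an assumed example figure, not from the presentation.

```python
# Illustrative projection: energy spend when unit price and consumption
# both grow year over year. Growth rates are from the slide above; the
# $1.0M baseline spend is an assumption for illustration only.
def project_energy_spend(baseline_usd, price_growth, consumption_growth, years):
    spend = baseline_usd
    for year in range(1, years + 1):
        spend *= (1 + price_growth) * (1 + consumption_growth)
        print(f"Year {year}: ${spend:,.0f}")
    return spend

# ~4%/yr price growth combined with ~12%/yr consumption growth compounds to
# roughly 16.5%/yr, doubling the bill in about five years.
project_energy_spend(1_000_000, 0.04, 0.12, 5)
```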

  3. Data Center Trends – Operating Expense Exceeds Capital Expense in less than 1 year • Data Center facility costs are growing 20% vs. IT spend of 6% • Operating costs over the lifetime of a server are ~4X the original purchase cost • Cooling infrastructure can consume up to 55% of Data Center energy. Source: Belady, C., “In the Data Center, Power and Cooling Costs More than IT Equipment it Supports”, Electronics Cooling Magazine (Feb 2007)
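
The "less than 1 year" headline follows from the ~4X lifetime figure. The sketch below makes the arithmetic explicit; the purchase price and the assumed four-year service life are illustrative assumptions, not figures from the slide.

```python
# If lifetime operating cost is ~4x the purchase price and a server is kept
# about four years, annual OPEX roughly equals the purchase price, so
# cumulative OPEX overtakes CAPEX within about the first year.
purchase_price = 5_000                 # assumed server CAPEX, USD
service_life_years = 4                 # assumed service life
lifetime_opex = 4 * purchase_price     # ~4x figure from the slide above
annual_opex = lifetime_opex / service_life_years
crossover_years = purchase_price / annual_opex
print(f"Annual OPEX ~${annual_opex:,.0f}; OPEX passes CAPEX after ~{crossover_years:.1f} year(s)")
```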

  4. UCSB – “The Problem” • UCSB’s existing Data Center is being renovated for research computing, forcing the corporate/miscellaneous IT equipment into a new space. • This new space is not designed to be a Data Center: the footprint is small, the power is limited by existing building wiring, and a traditional air-cooling topology is not feasible. • The new space limitations require the load density to increase from a typical density of 6kW or less to a higher density of 10-16kW per rack

  5. LiquiCool - “The Solution”

  6. LiquiCool - “The Solution” LiquiCool™ – A complete cooling solution for the consolidation and scale-out of compute infrastructure in today’s sustainable Data Centers

  7. LiquiCool - How does it work? • Based on IBM IP & Technology Licenses (>30 years of water cooling experience) • Rear Door Heat Exchanger (RDHx) replaces the existing rear door of the IT enclosure • RDHx has chilled water Supply & Return quick connections at bottom OR top • Raised floor becomes optional • Chilled water circulates through the tube+fin coil from the Supply connection • Equipment exhaust air passes through the coil and is cooled before re-entering the room [Diagram: fin + tube heat exchanger; airflow from front to rear of enclosure; cold supply water in, heated water out]

  8. LiquiCool System • Passive RDHx provides 100% sensible cooling • No condensation, no need for reheat or humidification • CDU creates a fully isolated, temperature controlled Secondary Loop • Chilled water source - city water, building chilled water, packaged chiller… [Diagram: secondary loop 10-17°C (50-63°F), water pressure 30-70 psi; primary chilled water 7°C (45°F), water pressure 100-200 psi]
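
The "no condensation" point holds as long as the secondary-loop supply stays above the room's dew point, which is what the CDU's temperature control provides. A quick check using the standard Magnus approximation is sketched below; the room condition (22°C at 40% RH) is an assumed example, not from the presentation.

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Approximate dew point (deg C) using the Magnus formula."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity)
    return (b * gamma) / (a - gamma)

room_temp_c = 22.0         # assumed room temperature
rel_humidity = 0.40        # assumed 40% relative humidity
dp = dew_point_c(room_temp_c, rel_humidity)

secondary_supply_c = 10.0  # low end of the 10-17 deg C secondary-loop range
status = "above" if secondary_supply_c > dp else "below"
print(f"Room dew point ~{dp:.1f} deg C; a {secondary_supply_c} deg C secondary supply is {status} it")
```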

  9. RDHx - External View • Passive • No electrical connections • No moving parts • No fans • No power • No noise • Attaches to rear • No need to rearrange racks • Does not consume valuable floor space, adds 4-6” to rear • Close-coupled • Neutralizes at the source [Photos: top feed connections, bottom feed connections]

  10. RDHx - Internal View [Callouts: protective barrier; air-bleed valves; bottom feed hose connections and drain valve; tube & fin coil]

  11. Thermal Image - Before & After: 100% Heat Neutralization

  12. RDHx Cooling in Action • Temperature readings taken in the rear of a fully populated enclosure • Rear Door Heat Exchanger door opened: server leaving temp 102ºF (38.9ºC) • Rear Door Heat Exchanger door closed: server leaving temp 74ºF (23.5ºC) • RDHx reduces leaving temperature by 28ºF (15.4ºC)!
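
The heat neutralized at the door scales with the airflow pushed through it by the servers' own fans. The sketch below applies the sensible-heat relation Q = ρ · V̇ · cp · ΔT to the measured 15.4°C drop; the 2,500 CFM rack airflow is an assumed figure for illustration, not from the slide.

```python
# Sensible heat absorbed by the RDHx coil from the rack exhaust air.
CFM_TO_M3S = 0.000471947   # cubic feet per minute -> cubic metres per second
RHO_AIR = 1.2              # kg/m^3, approximate air density
CP_AIR = 1005.0            # J/(kg*K), specific heat of air

airflow_cfm = 2500.0       # assumed total rack airflow (not from the slide)
delta_t_c = 15.4           # measured 102 F -> 74 F drop from the slide

q_watts = RHO_AIR * (airflow_cfm * CFM_TO_M3S) * CP_AIR * delta_t_c
print(f"Heat neutralized at the door: ~{q_watts / 1000:.1f} kW")   # ~21.9 kW
```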

  13. RDHx is Compatible with Most Major IT Enclosures [Diagram: industry standard enclosure; mount transition frame (if needed); remove existing rack rear door & hinges]

  14. RDHx General Specifications • Max. Cooling Capacity: 33kW • Coolant: Chilled Water (above dew point) • Dimensions: 76.6“ H x 4.6“ D x 23.6“ W (1945mm x 117mm x 600mm) • Weight – empty: 63lbs (29kg) • Liquid Volume: 1.5 Gallons (5.7 Liters) • Liquid Flow Rate: 6-10 GPM (23-38 L/min) • Head Loss: 7 psi (48 kPa) at 10 GPM (38 L/min) • System Input Power: None required • Noise: None • Couplings: Drip-free stainless steel quick-connects • Connection Location: Bottom or Top Feed
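
As a water-side sanity check of these figures (a standard sensible-heat relation, not vendor data): removing the rated 33 kW at the top-end 10 GPM flow implies roughly a 12-13°C rise in the water across the coil, as sketched below.

```python
# Water-side check: delta_T = Q / (m_dot * cp)
GPM_TO_LPM = 3.785         # US gallons per minute -> litres per minute
CP_WATER = 4186.0          # J/(kg*K)
RHO_WATER = 1.0            # kg/L, approximate

q_watts = 33_000.0                       # rated maximum cooling capacity
flow_lpm = 10 * GPM_TO_LPM               # ~38 L/min, top of the spec range
m_dot_kg_s = flow_lpm / 60.0 * RHO_WATER

delta_t_c = q_watts / (m_dot_kg_s * CP_WATER)
print(f"Water temperature rise at full load: ~{delta_t_c:.1f} deg C")   # ~12.5 deg C
```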

  15. Coolant Distribution Unit (CDU) • Power Consumption: 2.6 kW • Pump Capacity: 63 GPM at 30psi (240 L/min at 207 kPa) • Primary Head Loss: 10.2 psi at 63 GPM (70 kPa at 240 L/min) • Minimum Approach Temperature (100% load): • 120kW unit - 12°F (6.7 °C) • 150kW unit - 8°F (4.4 °C) • 63 GPM (240 L/min) on primary and secondary • Water to water heat exchanger with pumps, controls and chilled water valve • Creates an isolated secondary cooling loop • 100% sensible cooling, no condensation • Small water volume (tens of gallons) • Easier to control water quality • Redundant, fault-tolerant design • 120kW or 150kW capacity • Supports 6-12 RDHx • Optional internal manifold for quick expansion • SNMP & ModBus communications
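
Because the CDU speaks SNMP and Modbus, loop temperatures and flow can be polled by ordinary monitoring or DCIM tooling. Below is a minimal Modbus/TCP read sketched with pymodbus (3.x import path assumed); the host address, register addresses, and scaling are hypothetical placeholders, since the presentation does not give the CDU's register map.

```python
# Minimal Modbus/TCP poll of a CDU. The register addresses and scaling below
# are HYPOTHETICAL; consult the CDU's actual Modbus register documentation.
from pymodbus.client import ModbusTcpClient   # pymodbus 3.x import path

CDU_HOST = "192.0.2.10"    # example address (RFC 5737 documentation range)
SUPPLY_TEMP_REG = 100      # hypothetical: secondary supply temperature, 0.1 deg C units
FLOW_REG = 102             # hypothetical: secondary loop flow, 0.1 GPM units

client = ModbusTcpClient(CDU_HOST, port=502)
if client.connect():
    result = client.read_holding_registers(SUPPLY_TEMP_REG, count=4)
    if not result.isError():
        supply_temp_c = result.registers[0] / 10.0
        flow_gpm = result.registers[FLOW_REG - SUPPLY_TEMP_REG] / 10.0
        print(f"Secondary supply {supply_temp_c:.1f} deg C, flow {flow_gpm:.1f} GPM")
    client.close()
```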

  16. CDU Simplified

  17. Floor-mount CDU Internal - Front [Callouts: controller; brazed plate heat exchanger; inverter drive; redundant valves; reservoir tank; redundant variable speed pumps; casters and drain]

  18. Floor-mount CDU Internal - Rear [Callouts: optional secondary loop distribution manifold; primary side water filter; primary supply and return connections; optional secondary loop flex tails]

  19. Hose Kits & External Manifolds • Connects to flex tails on CDU secondary side • ISO B or Sweated Connections • Standard & custom configurations • Each Vette Hose Kit consists of a flexible Supply hose and a Return hose • Factory assembled and tested to IBM specifications and standards • Quick-connect drip-free couplings on one end OR both ends • Straight hoses for raised floor environments, right angle hoses for non-raised floor environments • Standard lengths from 3ft. to 50ft.

  20. Water Treatment - Treatment of Cooling Water • Potential effects of non-treatment: loss of heat transfer, reduced system efficiency, reduced equipment life, equipment failures or leaks • De-ionized water without inhibitors is corrosive! [Diagram: scale, fouling, microbiological growth, corrosion]

  21. Scenario I – Out of Space • Add RDHx – double your load per rack • Eliminate CRAC units • 56% recovery of white space!

  22. Scenario II – Out of Power/Capacity • Add RDHx • Remove (2) CRAC units • Reduces cooling energy consumption to free up capacity for growth

  23. Scenario III – High Density • Adding RDHx allows 8X the compute power! • CRAC units can typically provide efficient environmental control for rack densities of up to 5 kW per rack

  24. Reference Sites Warwick University, Coventry, UK National Center for HPC, Taiwan

  25. Reference Sites • Georgia Tech Super Computing Facility - 12 racks at ~24kW each [Photos: front view, rear view]

  26. Silicon Valley Leadership Group Case Study - Modular Cooling Systems

  27. SVLG “Chill Off” Results • Vette’s LiquiCool™ solution led the field in cooling capacity and in cooling efficiency!

  28. LiquiCool - Conclusive Savings for Energy, Space & Cost • The largest share of Data Center OPEX growth is power & cooling related • Cost of energy for cooling is a large (and growing) cost component • Data Center consolidation, virtualization and advanced hardware technology are driving higher power densities per rack and associated white space constraints • Traditional air-cooling is increasingly infeasible at these densities • Purchasing decisions can no longer be made solely on CAPEX • TCO is not merely a consideration; it is core to the decision. Value Summary: • Reduces white space requirements by more than 55% • Cuts cooling energy consumption by 50% or more compared to traditional air-cooled Data Centers • Allows 8X the amount of compute power in a typical IT enclosure • Lowers carbon footprint by 50% or more vs. air-cooling • Bottom Line: Payback in less than 1 year when compared to traditional computer room air-conditioning
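
One common way to frame the payback claim is to compare the net added CAPEX (liquid-cooling hardware minus the CRAC capacity it displaces) against the annual cooling energy saved. The sketch below does that; every cost and load figure in it is an illustrative assumption, and only the "50% or more" savings fraction comes from the summary above.

```python
# Rough simple-payback sketch. All dollar figures, loads, and the avoided-CRAC
# credit are assumptions for illustration; only the 50% savings fraction is
# taken from the value summary above.
racks = 10
it_load_kw_per_rack = 12.0
cooling_kw_per_it_kw = 0.8        # assumed air-cooled cooling overhead
savings_fraction = 0.5            # "50% or more" from the summary
energy_price_per_kwh = 0.12       # assumed USD/kWh

capex_liquid_cooling = 120_000.0  # assumed: RDHx doors + CDU + plumbing
capex_crac_avoided = 80_000.0     # assumed: CRAC units no longer purchased

it_load_kw = racks * it_load_kw_per_rack
cooling_kw_saved = it_load_kw * cooling_kw_per_it_kw * savings_fraction
annual_savings_usd = cooling_kw_saved * 8760 * energy_price_per_kwh

net_capex = capex_liquid_cooling - capex_crac_avoided
print(f"Annual cooling energy savings: ~${annual_savings_usd:,.0f}")
print(f"Simple payback on net CAPEX: ~{net_capex / annual_savings_usd:.1f} years")
```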

  29. End. Thank You
