Planning & Optimising the Green IT Datacentre: Design, Operation & Management Best Practices, Technologies & Challenges

Presentation Transcript



Planning & Optimising the Green IT Datacentre: Design, Operation & Management Best Practices, Technologies & Challenges
Pierre Ketteridge, IP Performance Ltd

Green IT Business Transformation Seminar



Introduction

Yes! Of course…

…but only with careful planning, design and management!



Introduction

  • The direct carbon impact (i.e. carbon footprint) of Data Centres on the environment is almost exclusively related to power consumption

  • Data Centres do not (when properly designed and managed) vent hot air or polluting gases into the atmosphere – cooling should be a ‘closed system’

  • There may be indirect carbon impacts through staffing levels, travel to and from site, operational maintenance and housekeeping



Introduction

15% of business power consumption is accounted for by Data Centres & ICT…

[Chart] Typical Data Centre power breakdown: Cooling 50%; IT components 40%; power distribution 10%.

…Lighting accounts for 1-3%, dependent on whether lights-out (LO) operation is implemented or not. (The PUE this breakdown implies is sketched below.)
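To put that breakdown in more familiar terms, here is a minimal sketch (my own framing, not from the deck) of the power usage effectiveness it implies, assuming the 'IT components' share is the IT load and everything else is facility overhead:

```python
# Implied PUE from the slide's breakdown. Assumption: "IT components"
# is the IT load; cooling and power distribution are facility overhead.
share = {"cooling": 0.50, "it": 0.40, "power_distribution": 0.10}
pue = sum(share.values()) / share["it"]   # total facility power / IT power
print(f"Implied PUE: {pue:.2f}")          # 2.50
```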



Cooling

Cooling falls into two categories:

Air Cooling

Liquid (water) Cooling



Cooling> Air Cooling

Air Cooling

The traditional way of cooling a Data Centre Computer Room:

CRAC (Computer Room Air Conditioner)

Water Chiller

Cold Aisle/Hot Aisle Configuration



Cooling> Air Cooling

Inherent limitations of CRAC-based Air Cooling Systems:

CRAC capacity needs to be 30% greater than the actual demand

Limitations in cooling (5kW – 7kW per rack)

N+1 active equipment resilience/redundancy drives efficiency of cooling system down further



Cooling> Air Cooling

Some Easy-to-Implement Air Cooling Optimisation Suggestions:

Hot Aisle/Cold Aisle Arrangement

Cold Aisle Containment

Blanking Panels

Raised Floor Brush Strips

Underfloor, Inter- and Intra-rack Cable Management

Free Air Cooling



Cooling> Air Cooling> Hot Aisle/Cold Aisle

  • With no hot aisle/cold aisle arrangement, returning heated air mixes with the CRAC-cooled air and cooling to the DC CR equipment is impaired. There is also the issue of bypass cold airflow, which can impact chiller operation.

  • With a hot aisle/cold aisle arrangement, chilled air is forced out into the front-of-cabinet facing cold aisles, across the equipment surface, and warm air is channelled out into the rear-of-cabinet facing hot aisle for return to the chiller/CRAC.



Cooling> Air Cooling> Hot Aisle/Cold Aisle

  • Ineffective positioning of CRACs impairs the airflow around the DC CR.

  • CRACs along the side walls are too close to the equipment racks, and will cause the airflow to bypass the floor vents in those cold aisles.

  • Place cooling units at the end of the equipment rows, not mid-row.

  • CRACs should be aligned with the hot aisles to prevent hot/cold aisle airflow crossover, which not only increases the temperature of the air supplied to the rack fronts but can also trigger the cooling unit to throttle back, reducing cooling overall.

  • Limit maximum cooling unit throw distance to 50'.



Cooling> Air Cooling> Hot Aisle/Cold Aisle

Separation of High-density Racks

  • Air cooling systems become ineffective when high-density racks are co-located

  • “Borrowing” of adjacent rack cooling capacity is not possible in this configuration

  • An alternative (other than self-contained cooling) is to spread out high-density racks to maintain the cooling averages

  • Obviously this is not always practical – witness the prevalence of blade server and virtualisation technologies, with two to five times the per-rack power draw of traditional servers



Cooling> Air Cooling> Cold Aisle Containment

Cold Aisle Containment

  • Very simple to deploy/retrofit

  • Hot and cold aisles physically separated

  • Greater watts per rack (approx. 10kW)

  • Oversizing of the CRAC is reduced

  • CRAC efficiency is increased due to a higher delta-T (see the airflow sketch below)

  • CRAC fan speed can be reduced, which provides:

    • Reduced running costs

    • Increased MTBF
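Why a higher delta-T helps: by the sensible-heat relation Q = ρ · cp · V̇ · ΔT, the same heat load needs less airflow when the return air comes back hotter, so fans can slow down. A minimal sketch (standard physics; the delta-T values are illustrative assumptions, not figures from the deck):

```python
# Airflow a CRAC must move to remove a heat load at a given
# supply/return delta-T, from Q = rho * cp * V_dot * dT (sensible heat).
RHO_AIR = 1.2     # kg/m^3, approx. at ~20 C
CP_AIR = 1005.0   # J/(kg*K)

def airflow_m3_per_s(heat_kw: float, delta_t_k: float) -> float:
    return heat_kw * 1000.0 / (RHO_AIR * CP_AIR * delta_t_k)

# Containment raises the return temperature (higher delta-T), so the
# same 10 kW rack needs less air moved per second.
for dt in (8.0, 12.0):  # illustrative uncontained vs contained delta-Ts
    print(f"dT = {dt:4.1f} K -> {airflow_m3_per_s(10.0, dt):.2f} m^3/s")
```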



Cooling> Air Cooling> Blanking Panels

  • Reduction and stabilisation of equipment air-intake temperatures

  • Elimination or reduction of the number and severity of hotspots

  • Increased availability, performance, and reliability of IT equipment, especially in the top one-third of the equipment cabinet

  • Elimination of exhaust air recirculation within the cabinet, optimising cooling and reducing energy consumption and OpEx

  • Deferral of CapEx (additional cooling capacity)

  • The potential of greening the data centre by reducing its carbon footprint



Cooling> Air Cooling> Raised Floor Brush Strips

Raised Floor Brush Grommets

  • Cable openings allow approx. 60% of conditioned air to escape

  • Use brush grommets to seal every cabling entry point

  • Increases static pressure in the under-floor plenum – ensures that the DC airflow remains at a pressure above atmospheric

  • Extends the reach of the Hot Aisle/Cold Aisle system

  • Self-sealing and interwoven closure system

  • Brush grommets can be installed as the DC is commissioned, or retro-fitted

  • No changes to existing wiring configuration

  • Fits into the raised floor tiles prior to cabinet installation

  • Simple

  • Inexpensive



Cooling> Air Cooling> Cable Management

Cable Management – Intra-rack, Inter-rack and Underfloor

  • Airflow within racks is also affected by unstructured cabling arrangements

  • Deployment of high-density servers creates new problems in cable management

  • Cut data cables and power cords to the correct length – use patch panels where appropriate

  • Equipment power should be fed from rack-mounted PDUs

  • Raised floor/subfloor plenum ducting carries other services apart from airflow:

    • Data cabling, power cabling, water pipes/fire detection & extinguishing systems

  • Remove unnecessary or unused cabling - old cabling is often abandoned beneath the floor – particularly in high churn/turnover Co-Lo facilities

  • Spread power cables out on the subfloor - under the cold aisle to minimize airflow restrictions

  • Run subfloor data cabling trays at the stringer level in the hot aisle - or at an “upper level” in the cold aisle, to keep the lower space free to act as the cooling plenum



Cooling> Air Cooling> Free Air Cooling

What is Free Cooling?

Free cooling uses low outside ambient temperatures to cool the chilled-water loop via an external heat exchanger, so the chiller's compressors can stay off for much of the year.



Cooling> Air Cooling> Free Air Cooling

Average UK Temperatures



Cooling> Air Cooling> Free Air Cooling

Budgetary Example – Projected Cost of Running the System for a Year

Not using the Free Cooler:

  • Chiller capacity: 150 kW
  • Energy needed to run the chiller: 62 kW
  • Number of hours running per year: 8,784
  • Cost per kWh: £0.0784
  • Total cost of running per year: £42,697.00

100% free cooling 70% of the year:

  • Chiller capacity: 150 kW
  • Energy needed to run the chiller: 62 kW
  • Number of hours running per year: 2,580
  • Cost per kWh: £0.0784
  • Cost of running the chiller: £12,540.00
  • Cost of running free cooling (10.4 kW): £5,058.00
  • Total cost of running per year: £17,599.00

(The arithmetic is reproduced in the sketch below.)
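A minimal Python sketch reproducing the slide's arithmetic (one assumption on my part: the free cooler runs for the remaining 8,784 − 2,580 = 6,204 hours, which matches the £5,058 figure):

```python
# Projected annual cooling cost, using the slide's figures.
HOURS_PER_YEAR = 8784   # 366-day year, as on the slide
TARIFF = 0.0784         # GBP per kWh

def annual_cost(load_kw: float, hours: float) -> float:
    """Cost of running a continuous load_kw draw for the given hours."""
    return load_kw * hours * TARIFF

chiller_kw = 62.0
print(f"Chiller only: £{annual_cost(chiller_kw, HOURS_PER_YEAR):,.0f}")  # ~£42,697

chiller_hours = 2580                          # chiller needed ~30% of the year
free_hours = HOURS_PER_YEAR - chiller_hours   # 6,204 h on the free cooler
total = annual_cost(chiller_kw, chiller_hours) + annual_cost(10.4, free_hours)
print(f"With free cooling: £{total:,.0f}")    # ~£17,599
```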



Cooling> Liquid Cooling

High Density Data Centres and Liquid Cooling

  • When going above 10kW per rack, a new, more targeted/directed cooling method is required

  • The most common method is water cooling



Cooling> Liquid Cooling

So What is Liquid – or Water – Cooling?

  • Delivery of chilled water to multiple heat exchange points from a central unit

  • The central unit circulates water from the building's existing chilled water loop

  • Heat exchange units in rear doors (one per cabinet, capacity 30kW) or side doors (2 x dual cabinet resilience, 2 x 15kW)

  • Heat is carried away in the water - air is ejected back out into the DC at the same temperature it entered the rack - zero thermal footprint



Cooling> Liquid Cooling

Why Use Water Cooling?

  • Water is around 3,500 times more thermally efficient than air

  • Air cooling only delivers 5-7kW of cooling per rack (10kW with a hot aisle/cold aisle arrangement)

  • High Density DCs place increasing power and thermal control demands on the infrastructure

  • Blade servers - up to 80 servers in a standard 42U cabinet – and anything from 80 to 800 virtual machines!

  • A fully-loaded blade server rack can produce 25kW of heat

  • Water cooling can deliver 30kW of cooling to a fully-loaded 42U rack (see the flow-rate sketch below)
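The water flow rates involved are modest; a minimal sketch using Q = ṁ · cp · ΔT (the 10 K water delta-T is an illustrative assumption, not a figure from the deck):

```python
# Water flow needed to carry a rack's heat load away: Q = m_dot * cp * dT.
CP_WATER = 4186.0  # J/(kg*K)

def water_flow_l_per_s(heat_kw: float, delta_t_k: float) -> float:
    # For water, 1 kg/s is roughly 1 L/s.
    return heat_kw * 1000.0 / (CP_WATER * delta_t_k)

# A fully-loaded 30 kW rack at an assumed 10 K water delta-T:
print(f"{water_flow_l_per_s(30.0, 10.0):.2f} L/s")  # ~0.72 L/s
```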



Cooling> Liquid Cooling

Adding the benefits of Free Cooling, some CapEx/OpEx implications of Water Cooling:

  • Water cooling has a slightly higher install cost (more terminations/pipe work)… but greater kW per sq ft gives us:

    • 35-45% reduction in required real estate

    • 15-30% reduction in overall construction costs

    • 10-20% reduction in total annual fan power consumption

    • 12-14% reduction in power delivered to the mechanical chilled water plant

  • For an average efficiency data centre, annual savings of £22,000 and £80,000 for small and large data centres respectively

  • Significant when the design life of the data centre is 10 years (see the sketch below)

  • A reduction in energy is a reduction in costs, and also a reduction in your carbon footprint
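A simple, undiscounted view of those savings over the stated design life (a sketch only; it ignores energy price inflation and discounting):

```python
# Undiscounted design-life view of the slide's annual savings figures.
DESIGN_LIFE_YEARS = 10
for size, annual_saving in (("small", 22_000), ("large", 80_000)):
    total = annual_saving * DESIGN_LIFE_YEARS
    print(f"{size} data centre: £{total:,} over {DESIGN_LIFE_YEARS} years")
```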



Network Components

Active Equipment (Networking)

  • Switches

  • Routers

  • Appliances

    • Load balancers

    • Caching/Proxying

    • Bandwidth Management

    • Application Acceleration & Optimisation



Network Components> Ethernet LAN Switches

Data Centre Switch Requirements

  • Port density

  • Performance

  • Functionality

  • Feature set

  • Resilience/Redundancy

  • Security

  • Price

  • Power consumption/Heat output

[Slide groups these as ‘Feeds & Speeds’ and ‘Capabilities’]



Network Components> Ethernet LAN Switches

Data Centre Switch Requirements

  • High port density per chassis

  • Low power consumption

    • Availability

    • High performance

    • Low latency

Optimised for the environment – optimised for the application





Network Components> Ethernet LAN Switches

Ethernet Switch Power Consumption - A Comparative Example: 15,000 User Network

  • Across an installed network base of 15,000 ports, it was possible to save 102 kW of power (see the sketch below), resulting in:

  • Lower Power Consumption

  • Less Cooling Equipment

  • Smaller Batteries

  • Smaller Data Centres
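A quick sense of scale for that figure (a sketch only: the per-port saving and annual cost are my own arithmetic, reusing the tariff and hours from the free-cooling slide):

```python
# Scale of the saving implied by the 15,000-port example.
ports = 15_000
saving_kw = 102.0                       # slide's figure, read as 102 kW of load
per_port_w = saving_kw * 1000 / ports   # ~6.8 W per port
annual_kwh = saving_kw * 8784           # hours per year, as used earlier
print(f"{per_port_w:.1f} W/port; {annual_kwh:,.0f} kWh/year")
print(f"~£{annual_kwh * 0.0784:,.0f}/year at £0.0784/kWh")
```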



Network Components> WAN Routers

Routers

  • Look at power consumption figures/thermal output

  • Deploy shared WAN architecture – MPLS, VPLS, IP VPNs

  • Investigate leveraging and integrating bandwidth optimisation and application acceleration technologies



Network Components> Appliances> Load Balancing

LAN/WAN Optimisation Appliances

  • …an area where we can make a difference, in the way in which technologies are deployed to optimise LAN/WAN bandwidth usage and availability of back-end servers.

  • An excellent example would be application delivery, traffic management and web server load balancers:

  • High Performance through acceleration techniques

  • High Availability
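To make the load-balancing idea concrete, a toy sketch (illustrative only: the server IPs are placeholders, and a production load balancer adds the health checks, session persistence and acceleration techniques mentioned above):

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy server pool that spreads requests evenly across back-ends."""
    def __init__(self, servers):
        self._pool = cycle(servers)

    def pick(self) -> str:
        return next(self._pool)

lb = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])  # placeholders
print([lb.pick() for _ in range(5)])  # requests rotate through the pool
```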



Network Components> Appliances> DPI Bandwidth Management

More LAN/WAN Optimisation Options…

DPI Bandwidth Management solutions:

  • Inspection, Classification, Policy Enforcement and Reporting on all traffic:

    • Identification - application signature, TCP/UDP port, protocol, source/destination IP addresses, URL

    • Classification – CoS/ToS (IP Prec/DiffServ Code Point/DSCP); user-defined QoS policy (see the marking sketch below)

    • Enforcement based on user-defined policy

    • Reporting – real-time and long-term – extremely valuable for SLAs/SLGs in DC environments
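For a sense of what such a policy can match on, a minimal sketch of DSCP marking at an endpoint (Linux sockets; the address and port are placeholders). A DPI device classifies and enforces on exactly this kind of marking:

```python
import socket

# Mark outbound packets with DSCP EF (46); the TOS byte is DSCP << 2.
DSCP_EF = 46
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)  # 0xB8
sock.sendto(b"probe", ("192.0.2.10", 5060))  # documentation-range address
```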



Network Components> Appliances> WAN Optimisation

LAN/WAN Optimisation Options (cont’d)

WAN optimisation and application acceleration:

  • Usually deployed as a reverse proxy device

  • Provides some form of bandwidth management

  • Protocol optimisation – making LAN protocols more latency-tolerant

    • e.g. TCP handshake spoofing

  • Object caching

    • Files, videos, web content, locally cached and served

  • Byte caching

    • Repetitive traffic streams, hierarchically indexed and tagged (inline only); see the sketch after this list

  • Compression

    • (inline only)

  • Proxy support for common protocols

    • HTTP, CIFS, SSL (HTTPS), FTP, MAPI, P2P, MMS, RTSP, QT, TCP-Tunnel, DNS etc
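A toy illustration of the byte-caching idea (a sketch only: real appliances chunk at variable boundaries and index hierarchically, as the slide notes):

```python
import hashlib

def byte_cache_encode(stream: bytes, cache: dict, chunk: int = 4096) -> list:
    """Toy byte cache: send a short hash reference for any chunk the far
    end has already seen, and the raw bytes (plus a cache insert) otherwise."""
    out = []
    for i in range(0, len(stream), chunk):
        block = stream[i:i + chunk]
        digest = hashlib.sha1(block).digest()
        if digest in cache:
            out.append(("ref", digest))   # 20 bytes instead of 4 KB
        else:
            cache[digest] = block
            out.append(("raw", block))
    return out

cache: dict = {}
encoded = byte_cache_encode(b"X" * 12_288, cache)  # three identical chunks
print([kind for kind, _ in encoded])               # ['raw', 'ref', 'ref']
```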



Network Components> Appliances> WAN Optimisation

LAN/WAN Optimisation Options (cont’d)

WAN optimisation and application acceleration:

  • Reverse Proxy

  • Bandwidth Management

  • Protocol optimisation – for latency-intolerant LAN protocols

    • e.g. TCP handshake spoofing

  • Object caching

  • Byte caching

  • Compression (inline only)

  • Proxy support for all/most common protocols



Infrastructure Management

Managing the Data Centre Infrastructure

  • “Lights Out” operation requires…

    • Little or no human intervention

    • Exceptions:

      • Planned maintenance

      • Fault rectification/management (emergency maintenance/repair)

      • Physical installs/removals

      • Housekeeping (cable management, MACs – moves/adds/changes)

      • Cleaning

    • How are you going to control it? How are you going to manage it?



Infrastructure Management

Remote Control and Management

  • RDC, VNC – In Band Management

  • Console Servers – Out of Band Management

  • KVM switching (local/remote)

  • KVM/IP switching & USB2 virtual media (VM) remote drive mapping

  • IPMI Service Processor OOB Management

  • Intelligent Power Management (iPDUs)
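As a concrete example of Service Processor OOB management, a minimal sketch driving a BMC with ipmitool over IPMI-over-LAN (assumes ipmitool is installed; the host and credentials are placeholders):

```python
import subprocess

def ipmi(host: str, user: str, password: str, *args: str) -> str:
    """Run an ipmitool command against a server's BMC over the LAN."""
    cmd = ["ipmitool", "-I", "lanplus", "-H", host,
           "-U", user, "-P", password, *args]
    return subprocess.run(cmd, capture_output=True, text=True,
                          check=True).stdout

# Placeholder BMC address and credentials:
print(ipmi("192.0.2.50", "admin", "secret", "chassis", "power", "status"))
print(ipmi("192.0.2.50", "admin", "secret", "sdr", "list"))  # sensors
```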





Summary

Summary - Cooling

  • Data Centre “Greening” is mainly down to managing power consumption

  • Cooling is the biggest consumer of power (50%)

  • Optimise your air-cooled CRAC system:

    • Cold Aisle/Hot Aisle arrangement

    • Cold Aisle containment

    • Blanking Panels

    • Raised floor/underfloor brush strips/grommets

    • Free air cooling system



Summary

Summary – Cooling (Cont’d)

  • If deploying high-density blade servers/virtualisation, consider water cooling (maximum cooling per rack rises from 5-10kW to 30kW)

  • Targeted control

  • Even distribution of cooling

  • Full (42u) rack utilisation

  • Zero thermal footprint – design flexibility

  • Remember free air cooling reduces costs further

  • Real Estate savings



Summary

Summary - Active Equipment (Networking)

Switches:

  • high port density, low power consumption, PSU disconnect/fanless operation

  • Extrapolate power consumption over entire port count

    Routers:

  • Modular architecture, high density, low power consumption

  • Make full use of available bandwidth

    • Shared services: IP VPN, point-to-multipoint or meshed MPLS

    • Use/honour QoS marking

    • Deploy Bandwidth optimisation techniques



Summary

Summary - Active Equipment (Networking) – Cont’d

Appliances:

  • Load Balancing – Maximise performance, utilisation and availability of server resources

  • DPI Bandwidth Management

  • WAN Optimisation

Maximise performance, utilisation and availability of WAN resources



Summary

Summary – Infrastructure Management

  • Remote Infrastructure Control and Management enables “lights-out” operation

  • Remote console management gives CLI access to network infrastructure – routers, switches, firewalls, other network optimisation appliances

  • KVM-over-IP allows remote, distributed control of server and workstation systems

  • Service Processor Management allows remote control and management of system processor and environmental monitors/controls

  • Intelligent Power Management enables remote monitoring, control and management of PDUs, UPS and battery backup resources



Close

THANK YOU

Pierre Ketteridge, IP Performance Ltd

[email protected]

[email protected]

www.ip-performance.co.uk

