Paint IT Green: A Guide to Lowering Your ICT Carbon from Data Centre to Desktop

Presentation Transcript


  1. Paint IT Green: A Guide to Lowering Your ICT Carbon from Data Centre to Desktop www.herts.ac.uk/about-us/facilities/learning-resources/about-lis/rare-idc/conference.cfm

  2. Paint IT Green: A Guide to Lowering Your ICT Carbon from Data Centre to Desktop. Welcome To The University of Hertfordshire. Professor Di Martin, Chief Information Officer

  3. Paint IT Green Conference Programme
     Richard Smeeton, Chief Technology Officer, UH
     10:00 Keynote Speaker: Dr Ian Bitterlin, CTO, Prism Power, 'Shared services': increased energy efficiency via consolidation in larger facilities
     10:30 Break & Refreshments
     11:00 Steve Phipps, Data Centres Manager, UH, A Best Practice Guide to Greening the Micro Data Centre
     11:40 Katherine O'Brien, Environmental Coordinator, UH, Useful Strategies for Improving Environmental Performance
     12:00 Pete Sands, Lead Data Centre Design Consultant, Future-Tech SCI, The Cost of Data Centre Ownership
     12:30 Lunch
     13:30 Howard Noble, Principal Investigator, Oxford University, "When Should I Do My Bit" (Workshop)
     14:00 Steve Phipps, Data Centres Manager, UH, Providing a Green Benchmark
     14:30 Mark Johnson, Consultant, AEA Group, What CRC means for HE/FE/Public Sector and what to do about it
     15:00 Break & Refreshments
     15:30 Panel Discussion; Chair: Professor Andrew Starr, Director of the Centre for Sustainable Communities, UH; Panel: Barry Lewington,
     16:00 Richard Smeeton, CTO, UH, Wrap Up of the Day
     16:30 - 18:00 Networking / Refreshments

  4. Next Presentation: Keynote – Dr Ian Bitterlin

  5. 'Shared services': Increased energy efficiency via consolidation in larger facilities? Dr Ian F Bitterlin PhD BSc(Hons) BA DipDesInn MCIBSE MIET MIEEE MBCS MBIFM Prism Power Ltd, UK www.prismpower.co.uk

  6. Three steps to Sustainability
     • Reduce consumption
       • The social & economic value of the data processed?
     • Improve efficiency
       • Not just PUE but also IT software & hardware
     • Use energy from renewable sources
       • Building a 'legacy' datacentre next to a hydro-electric scheme is NOT a sustainable design; it is a waste of valuable green energy on an inefficient data centre

  7. ‘Value’ of the traffic? Rise of the ‘Hyper Giants’: Five years ago, Internet traffic was proportionally distributed across tens of thousands of enterprise managed web sites and servers around the world. Today, most content has increasingly migrated to a small number of very large hosting, cloud and content providers. Out of the 40,000 routed end-sites in the Internet, 30 large companies – “hyper giants” like Limelight, Facebook, Google, Microsoft and YouTube – now generate and consume a disproportionate 30% of all Internet traffic. Arbor Networks, 13th October 2009

  8. Targets should be IT & Cooling?

  9. IT hardware efficiency = 0%?
     • Worldwide chip utilisation = ~5%
     • IT power = chip, memory, drives, I/O, power conversion & fans
     • At ~5% IT 'load' the average server still draws >60% of its full-load power
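A minimal sketch of the claim above, in Python. Nothing below comes from the presentation itself: the linear power model, the `server_power` helper and the 0.6 idle fraction are illustrative assumptions, chosen to match the slide's ">60% power at ~5% load" figure.

```python
# Illustrative model: servers draw a large fixed share of peak power even
# when nearly idle, so low utilisation wastes most of the energy.
# The linear scaling and the 0.6 idle fraction are assumptions, picked to
# match the slide's ">60% power at ~5% load" claim.

def server_power(utilisation, p_idle=0.6, p_peak=1.0):
    """Fraction of peak power drawn at a given utilisation (0..1)."""
    return p_idle + (p_peak - p_idle) * utilisation

u = 0.05  # ~5% worldwide chip utilisation, per the slide
print(f"At {u:.0%} load the server draws {server_power(u):.0%} of peak power")
# -> At 5% load the server draws 62% of peak power
# Work delivered per watt is then roughly 0.05 / 0.62 ≈ 8% of what the
# same server achieves at full load - hence "IT hardware efficiency = 0%?"
```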

  10. Where the power goes … (1 MVA supply)
      IT load: 500 kW
      Cooling fans, pumps & compressors: 240 kW
      Distribution & UPS losses: 35 kW
      Lighting & small power: 15 kW
      Ventilation – Fresh Air: 5 kW
      Security, BMS: 3 kW
      Communications: 2 kW
      Total: 800 kW
      PUE = 800 kW / 500 kW = 1.6
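As a cross-check, the slide's PUE can be recomputed directly from the breakdown above; this is just the standard PUE definition (total facility power divided by IT power) applied to the slide's own figures, sketched here in Python:

```python
# Recomputing the slide's PUE from its load breakdown (all values in kW,
# taken from the slide).
loads_kw = {
    "IT load": 500,
    "Cooling fans, pumps & compressors": 240,
    "Distribution & UPS losses": 35,
    "Lighting & small power": 15,
    "Ventilation - Fresh Air": 5,
    "Security, BMS": 3,
    "Communications": 2,
}

total_kw = sum(loads_kw.values())      # 800 kW drawn from the 1 MVA supply
pue = total_kw / loads_kw["IT load"]   # PUE = total facility power / IT power
print(f"Total: {total_kw} kW, PUE = {pue:.1f}")  # -> Total: 800 kW, PUE = 1.6
```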

  11. What is ‘the’ PUE?

  12. Key factors for high efficiency
      • High-efficiency servers & software
      • Heavy virtualisation
        • Multiple applications per server
      • High load vs capacity for M&E plant
      • Free-cooling or fresh-air cooling
      • Relaxed set-points for temperature + humidity
      • Strict air-flow management
      • High-efficiency UPS

  13. Free-Cooling – not ‘free’, but cheap

  14. Free Cooling vs temperature

  15. Electrical systems: Efficiency vs Load

  16. Easy to do on a large scale
      • Leverage infrastructure, NOC etc.
      • Modularise
      • Consolidate the load(s)
      • Build rooms as you grow
      • Manage the load vs capacity
      • Be clever with the redundancy strategy

  17. Hard to do on a small scale?
      • Predict the system capacity to get the end-game plan 'right'
        • The room will generally end up too large, running at too low a load, with the 'future' in mind
      • Keep the UPS system load high without having too much modularity
        • Likely to be running at 30% load & 80% efficiency (see the sketch after this list)
      • Introduce free-cooling
        • Most likely to be 'split' air-con with a CoP of 3-4 instead of 1.15-1.30
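To put a number on the UPS point, here is a minimal sketch. The 80% efficiency for a lightly loaded UPS comes from the slide; the 95% figure for a well-loaded high-efficiency unit and the 20 kW room size are illustrative assumptions, and `ups_loss_kw` is a hypothetical helper:

```python
# Power burned in the UPS itself to deliver a given IT load.
def ups_loss_kw(it_load_kw, efficiency):
    return it_load_kw / efficiency - it_load_kw

it_load = 20.0  # kW - a typical small data-room (see the next slide)
for label, eff in [("oversized UPS at ~30% load (80% efficient)", 0.80),
                   ("well-loaded high-efficiency UPS (95%, assumed)", 0.95)]:
    print(f"{label}: {ups_loss_kw(it_load, eff):.1f} kW lost")
# -> oversized UPS at ~30% load (80% efficient): 5.0 kW lost
# -> well-loaded high-efficiency UPS (95%, assumed): 1.1 kW lost
```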

  18. What could the impact be?
      • 100 small data-rooms, each with 10 cabinets of 2 kW: 20 kW IT load but designed for 50 kW, each operating at a PUE3 of 2.5
        • Total demand = 5 MW
      • One colo facility of 200 cabinets of 6 kW, operating at a PUE3 of 1.3
        • Total demand = 1.6 MW
      • c70% reduction
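The slide's arithmetic, reproduced step by step (all figures are from the slide; note that the consolidated case also assumes a smaller total IT load, 1,200 kW versus 2,000 kW, presumably through consolidation and higher utilisation, on top of the PUE improvement):

```python
# 100 small data-rooms, 10 cabinets x 2 kW each, PUE3 of 2.5
small_total_mw = 100 * (10 * 2) * 2.5 / 1000   # = 5.0 MW

# One colo facility, 200 cabinets x 6 kW, PUE3 of 1.3
colo_total_mw = (200 * 6) * 1.3 / 1000         # = 1.56 MW (the slide's ~1.6 MW)

saving = 1 - colo_total_mw / small_total_mw
print(f"{small_total_mw:.1f} MW -> {colo_total_mw:.2f} MW, "
      f"a {saving:.0%} reduction")             # -> 69%, the slide's "c70%"
```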

  19. Questions?

  20. Next: Break and Refreshments. Next Presentation: Steve Phipps
