
How Microsoft Does Data Centres


Presentation Transcript


  1. How Microsoft Does Data Centres. John Dwyer, Area Data Centre Manager - International Data Centre Solutions

  2. The Global Services Foundation: across the company, all over the world, around the clock
  Azure, SharePoint, Microsoft.com, Office Labs, Zurich, .NET Online, MBS Online, plus over 150 more sites and services
  (Yellow box or text = pipeline)

  3. Scale and Market Growth (Source: http://www.internetworldstats.com)

  4. Data Centre Economics Have Changed!
  • Cost of physical space was once the primary consideration in data centre design
  • Cost of power and cooling has risen to prominence
  • Data centre managers must now prioritise investment in efficient power and cooling systems to lower the total cost of ownership (TCO) of their facilities
  Belady, C., "In the Data Center, Power and Cooling Costs More than IT Equipment it Supports", Electronics Cooling Magazine (Feb 2007)
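The shift the slide describes can be sketched with rough numbers. Everything below (server price, power draw, PUE, tariff, lifetime) is an illustrative assumption, not Microsoft's data.

```python
# Rough comparison of a server's purchase price with the power-and-cooling
# cost of running it over its life. All numbers are illustrative assumptions.
server_price_usd = 3000          # assumed purchase price of a commodity server
server_power_w = 200             # assumed average draw at the plug
pue = 2.0                        # assumed facility overhead, typical of that era
electricity_usd_per_kwh = 0.10   # assumed energy tariff
lifetime_years = 4               # assumed depreciation period

hours = lifetime_years * 365 * 24
facility_kwh = server_power_w / 1000 * pue * hours
energy_cost = facility_kwh * electricity_usd_per_kwh

print(f"Lifetime power and cooling: {facility_kwh:,.0f} kWh -> ${energy_cost:,.0f} "
      f"vs ${server_price_usd:,} server price")
# With these assumptions the energy bill (~$1,400) is already close to half the
# server's price; at a higher PUE or tariff it exceeds the hardware cost.
```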

  5. Site Selection
  • Internet Population
  • Internet Peering / Network
  • Mobile Users
  • Power Pricing
  • Environmental
  • Construction Costs
  • Tax Climate
  • IT Labor Availability
  • Corporate Citizenship
  Composite Heat Map

  6. Why Power Matters…
  • In 2006, U.S. data centres consumed an estimated 61 billion kilowatt-hours (kWh) of energy, about 1.5% of the total electricity consumed in the U.S. that year
  • In the EU, data centres consumed an estimated 56 billion kilowatt-hours in 2007
  • As an industry segment, data centres are the fastest-growing energy segment in the US
  • Current projections are that data centre power consumption will exceed 100 billion kWh by 2011 in the US and by 2020 in the EU
  • Those levels of power consumption in the US would necessitate the construction of 10 additional power plants
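The first figure can be sanity-checked with quick arithmetic using only the numbers quoted on the slide.

```python
# Sanity check of the slide's first figure, using only the numbers it quotes.
dc_kwh_2006 = 61e9      # US data centre consumption, 2006
share_of_total = 0.015  # "about 1.5% of the total electricity consumed"
implied_us_total = dc_kwh_2006 / share_of_total
print(f"Implied total US consumption: ~{implied_us_total / 1e9:,.0f} billion kWh")
# ~4,067 billion kWh, in line with reported US electricity generation for 2006.
```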

  7. Relevant Metrics at Microsoft
  • PUE / DCiE (The Green Grid)
  • DC Utilization
  • Server Utilization
  • Cost
  • Move from Cost = f(space) to Cost = f(power)
  PUE = Total Facility Power / IT Equipment Power; DCiE = (IT Equipment Power / Total Facility Power) × 100%
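A minimal sketch of the two Green Grid metrics named on this slide, using their published definitions; the 139 kW facility figure in the example is an assumption chosen to illustrate a PUE of roughly 1.3.

```python
# Minimal sketch of the two Green Grid metrics named on this slide.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Data Centre infrastructure Efficiency: the inverse of PUE, as a percentage."""
    return it_equipment_kw / total_facility_kw * 100

# Example using an assumed 139 kW facility draw for a 107 kW IT load
# (chosen to illustrate the ~1.3 PUE reported for the container POC later on).
print(f"PUE:  {pue(139.0, 107.0):.2f}")    # ~1.30
print(f"DCiE: {dcie(139.0, 107.0):.0f}%")  # ~77%
```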

  8. SCRY

  9. Setting Aggressive PUE Targets

  10. Environmental Control Standards

  11. Where Data Centre Power Goes: Opportunities Are Everywhere
  GFS' Infrastructure Services is focusing on all the pieces of the pie
  Source: EYP Mission Critical Facilities Inc., New York

  12. Where Data Centre Power Goes: Opportunities Are Everywhere
  • Offline UPS technologies can drive UPS losses substantially down
  • Virtualization and active power management reduce the server share
  • Widening the environmental envelope can remove chillers and drive that share to zero
  GFS' Infrastructure Services is focusing on all the pieces of the pie
  Source: EYP Mission Critical Facilities Inc., New York
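One way to see how these levers move PUE is a small model; the power-share breakdown below is an assumed example for a hypothetical PUE 2.0 facility, not the EYP figures behind the slide's chart.

```python
# Assumed power-share breakdown for a hypothetical PUE 2.0 facility (shares sum
# to 1.0 of total facility power); not the EYP figures behind the slide's chart.
baseline = {"it_load": 0.50, "cooling": 0.33, "ups_losses": 0.10, "lighting_other": 0.07}

def pue_from_it_share(it_share: float) -> float:
    # PUE = total facility power / IT power; with shares normalised, that's 1 / IT share.
    return 1.0 / it_share

print(f"Baseline PUE: {pue_from_it_share(baseline['it_load']):.2f}")   # 2.00

# Apply the slide's levers: an efficient/offline UPS halves UPS losses, and a
# wider environmental envelope removes the chiller (cooling) load entirely.
improved_total = 1.0 - 0.5 * baseline["ups_losses"] - baseline["cooling"]
improved_pue = improved_total / baseline["it_load"]
print(f"PUE with levers applied: {improved_pue:.2f}")                  # ~1.24
```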

  13. Three of our Data Centres

  14. Data Centre Costs in the US
  • Land – 2%
  • Core & Shell Costs – 9%
  • Architectural – 7%
  • Mechanical / Electrical – 82%
  • Since 2004: a 16% year-over-year increase
  Where the costs are: >80% scales with power, <10% scales with space
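The split can be checked with simple arithmetic; which categories count as "space" is an interpretation here (land and architectural taken as space-driven).

```python
# Quick check of the build-cost split quoted on this slide.
cost_shares = {
    "land": 0.02,
    "core_and_shell": 0.09,
    "architectural": 0.07,
    "mechanical_electrical": 0.82,
}
scales_with_power = cost_shares["mechanical_electrical"]
scales_with_space = cost_shares["land"] + cost_shares["architectural"]

print(f"Shares sum to {sum(cost_shares.values()):.0%}")   # 100%
print(f"Scales with power: {scales_with_power:.0%}")       # >80%
print(f"Scales with space: ~{scales_with_space:.0%}")      # <10%
```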

  15. SCRY – Window to Our World

  16. SCRY Helps Demonstrate Continuous Improvement
  On existing data centres, and helps set goals for new data centres at Microsoft
  • 22% improvement over 3 years
  • Follows Moore's Law
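Reading the 22% as a cumulative reduction over three years, the implied annual rate is roughly 8%, as the short calculation below shows.

```python
# Annualised rate implied by "22% improvement over 3 years",
# reading the 22% as a cumulative reduction.
cumulative = 0.22
years = 3
annual = 1 - (1 - cumulative) ** (1 / years)
print(f"~{annual:.1%} improvement per year")   # ~8.0% per year
```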

  17. Where We Think Things Are Going …

  18. (image-only slide)

  19. Futures – Containers (Chicago)

  20. Why We Like Containers
  • Can deploy at scale
  • Plug and play
  • Drives innovation
  • Abstracts away religious wars in the competitive bid: AC vs DC, air vs liquid
  • Cost can include maintenance
  • Allows for easy cost and performance measurement
  • Creates an environment to drive competition around efficiency
  • One throat to choke
  • Question: is this water cooling or air cooling?

  21. Container Solutions
  • Use a standard ISO shipping container (40', 20', or 10' x 8' x 8'6") to hold servers
  • Portability allows the delivery and operation of servers in self-contained units
  • Moves cost from long-lead to short-lead equipment, improving return on invested capital
  • Optimizes server delivery at 1,000U+ as a unit vs. 40+U in a rack, with a single SKU and warranty
  • Containers are often seen as a solution for burst demand and temporary capacity; Microsoft's approach is different: use them as a primary packaging unit
  Cost: it costs less to ship 2,000+ servers in one container than it does to ship and then install individual racks. Additional savings come from not needing raised floors or fans for each server, and from requiring far less wiring, packaging and shipping.
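The packaging claim can be illustrated with the slide's own round numbers; the one-server-per-rack-unit figure is an assumption for the example.

```python
# The shipping/packaging claim in the slide's own round numbers; the
# one-server-per-rack-unit figure is an assumption for the example.
servers_per_container = 2000   # "2,000+ servers in one container"
servers_per_rack = 40          # "40+U in a rack", assuming ~1 server per U

racks_replaced = servers_per_container / servers_per_rack
print(f"One container replaces roughly {racks_replaced:.0f} individually "
      f"shipped and installed racks")   # ~50
```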

  22. Container Solutions
  • The container gives us the opportunity to test new technology such as increased supply-air temperature, removal of fans from servers, managed airflow with hot-aisle containment, significantly increased watts per square foot (WPSF) and more efficient electrical distribution
  • Microsoft has stand-alone units running in production today and is running proofs of concept on newer technology
  Energy efficiency: at more than 1,000 watts per square foot, containers allow us to power a lot more servers in a given area. This is the key reason containers have become economically viable. PUE measured at peak in a POC was ~1.3

  23. Container POC
  GFS DCS ran a proof of concept on a container system in Seattle, Washington, USA
  • PUE came in between 1.2 and 1.3 ↑
  • Ran the unit up to full load, measured at 107 kW = 178 watts per server ↑
  • Dropped power to the unit and it came back online with no issues ↑
  • Fans and servers ran at full speed for 7 minutes on batteries ↑
  • Container spec completed and vendor RFP underway ↑
  • Battery temperature remained at 75°F measured with a probe and IR camera; the back is exposed to 85°F. Need to place a temperature probe at the rear of the battery →
  • Ambient air temperature had a large effect on the temperature inside the trailer due to lack of insulation →
  • Permit and UL inspection took 90 days to obtain →
  • Harmonics above 15%, varying across all 3 phases ↓
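The load figures quoted above imply roughly 600 servers in the POC container; a quick check:

```python
# Implied server count from the POC load measurements quoted above.
full_load_kw = 107
watts_per_server = 178
servers = full_load_kw * 1000 / watts_per_server
print(f"Implied servers in the POC container: ~{servers:.0f}")   # ~600
```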

  24. Virtual Earth Case Study
  • Timeline started with the container in late August
  • PO approved in October
  • First unit live in January: 5 months from engineering to live
  • Delays encountered:
    • 1 month: flood-plain re-plan (new location, elevated foundation)
    • 1 month: Xcel Energy transformer install
    • 1 month: final concrete delayed due to snow
  • Actual planning, permitting and construction effort totalled about 3 months
  • First container live Jan 5th, third container Feb 2nd
  • Container vendor committing to a 6-8 week turnaround on orders long term

  25. Containers - Chicago Data Centre
  • Top floor: 10.8 MW traditional colo capacity
  • Ground floor: 20 MW container capacity
  • Microsoft container: 2,400 servers, 375 kW, standard 40-foot shipping container, target PUE 1.25
  • Elevated spine connection
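A few quantities follow directly from these figures; the 40 ft x 8 ft footprint used for the density estimate is an assumption about the container's floor area.

```python
# Quantities that follow from the Chicago container figures; the
# 40 ft x 8 ft footprint is an assumed value for the container's floor area.
servers = 2400
container_kw = 375
ground_floor_mw = 20
footprint_sqft = 40 * 8

print(f"Power per server: ~{container_kw * 1000 / servers:.0f} W")                  # ~156 W
print(f"Power density: ~{container_kw * 1000 / footprint_sqft:,.0f} W/sq ft")        # ~1,172
print(f"Containers to fill the ground floor: ~{ground_floor_mw * 1000 / container_kw:.0f}")  # ~53
```

The implied density is consistent with the more-than-1,000 watts per square foot cited on the earlier container slide, and the per-server draw is somewhat below the 178 W measured in the Seattle POC.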

  26. But More Change Is Coming…

  27. Generation 4 Modular Data Centres
  • Challenging many data centre conventions
  • Prefabrication of shell and M&E plant
  • Pre-assembled containers
  • Power density > 1,000 watts / square foot
  • Totally flexible configuration (classes)
  • PUE < 1.1 (depending on class)
  • 3-5 month time-to-market
  • Reduced cost of entry
  • Applying the Model-T approach
  • http://loosebolts.wordpress.com
  • http://blogs.technet.com/msdataCentres/
