
Schoolcraft College Data Center


Presentation Transcript


  1. Schoolcraft College Data Center Tier 3+

  2. Title: Design of a Shared Tier III+ Data Center: A Case Study with Design Alternatives and Selection Criteria. Abstract: Schoolcraft College is constructing a High Availability (HA) Data Center targeted at an Uptime Institute Tier Rating of III+. The new center will provide colocation hosting services to the educational, municipal, and commercial communities. The design criteria for this facility include substantial redundancy in the power, cooling, network, and security areas. For example, the power infrastructure redundancy includes a continuous-duty natural gas generator that will supplement the DTE public power grid, such that campus power demand should remain flat even when this 150+ rack facility is at capacity. The data center is also being designed to give students first-hand exposure to the skills needed to design and operate such a high-performance facility, without compromising security or uptime for municipal, commercial, and educational institution customers. The design criteria will be presented, alternative designs discussed, and the final selections presented and rationalized.

  3. Data Center Colocation Philosophy • Space is money • Reputation is success or failure • Customer service must be paramount • PUE drives profitability/cost containment • Design flexibility

  4. Understanding the Uptime Institute Definition of a Tier 3+ Data Center • Concurrently maintainable • Redundant capacity components • Multiple independent distribution paths • One distribution path required to serve computer equipment at any time • Dual-powered IT equipment • Twelve hours of on-site engine generator fuel storage for N capacity • Additional Tier 4 features: • Fault tolerant • N capacity power/cooling available after any infrastructure failure • Multiple independent active distribution paths

  5. Design Goals • 100% uptime • Tier 3+ rated data center • PUE target of 1.5 • Minimum of N+1 redundancy in all critical systems • Private power generation • 2N back-up power • Carrier neutral with multiple carriers [Service-stack graphic: basic colocation (space, power/cooling, access to carriers, security) → system design, disaster recovery, leased equipment → maintenance, remote hands, OS mgmt. → and beyond… DBMS]

  6. Design Goals - Continued • High-density power – average 5 kW/rack (capacity arithmetic sketched below) • 24/7 monitoring • 24/7 client access • Latest high-efficiency HVAC system • Clean-agent primary fire suppression system • Dual-authentication physical security with biometrics
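As a rough capacity check, the figures quoted in this deck (150+ racks from the abstract, the 5 kW/rack average above, and the PUE target of 1.5 from slide 5) combine as follows. A minimal sketch, with all values illustrative:

```python
# Back-of-envelope facility load estimate from this deck's stated figures.
# Assumptions: 150 racks (the abstract says "150+"), 5 kW/rack average
# density (slide 6), and the design PUE target of 1.5 (slide 5).

RACKS = 150            # minimum rack count quoted in the abstract
KW_PER_RACK = 5.0      # average design density
PUE_TARGET = 1.5       # design goal

it_load_kw = RACKS * KW_PER_RACK            # critical IT load at capacity
facility_load_kw = it_load_kw * PUE_TARGET  # IT load plus cooling/losses

print(f"IT load at capacity:      {it_load_kw:,.0f} kW")       # 750 kW
print(f"Total facility load:      {facility_load_kw:,.0f} kW") # 1,125 kW
print(f"Overhead (cooling, etc.): {facility_load_kw - it_load_kw:,.0f} kW")
```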

  7. Power – Design Alternatives • CHP generator vs. fuel cell • Number of failover protections • Single vs. dual utility feeds • Single vs. dual backup power • UPS battery vs. flywheel • UPS Tier III vs. IV (A+B) • Busway vs. conduit/wire • Branch circuit monitoring • Panel vs. busway vs. PDU • Transfer switch vs. PLC failover management • On-site load bank vs. maintenance service

  8. Data Center Power Plant

  9. Power Deployment 1.) From the Row… 2.) To the Rack… 3.) To inside the Rack…

  10. Network – Design Alternatives • Single vs. Multiple carriers • Single vs. Multiple entry points (Diverse Entry) • Single vs. Multiple carrier paths (Diverse Path) • Single vs. Dual lateral connection per carrier • Entry pathways owned vs. Carrier owned • Single vs. Dual core carrier routers to MDF • Single vs. Dual edge switches (HSRP) • Cisco 6500 series vs. Nexus with SDN • Carrier Battery plant vs. Operator A-B UPS

  11. Data Center Networking Redundancy

  12. SCDC and Merit Network Relationship • Merit will resell SCDC colocation services • Merit delivers 150 Mbps into the Applied Sciences building via an AT&T EVC • Merit Network services will come into the SCDC via AT&T or Level 3 "last mile" providers, with whom Merit already has relationships • Merit has approximately 3,000 miles of fiber network in Michigan • Merit has relationships and fiber access to area public school districts and universities as potential SCDC clients

  13. Cooling Considerations • Cooling 101 – get air to the front of the device so the device fans can pull it through to the back • Cooling is the largest "non-IT" power usage • Leakage – open spaces create air mixing, turbulence, and loss of efficiency • Design layout verified with CFD, including failure modes

  14. Cooling – Design Alternatives • Area/perimeter ("blowing up the balloon") • Ducted supply and/or ducted return • Raised floor, in-row, economizers, etc. • Hot aisle vs. cold aisle containment • DX (air), glycol, chilled water • DX pumped refrigerant with free cooling below 55 °F (changeover logic sketched below) • Heat exchangers
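A minimal sketch of the free-cooling changeover named above, assuming a single setpoint at 55 °F outdoor dry-bulb; real controllers add hysteresis, staging, and partial (mixed) economizer modes, and the function and constant names here are illustrative:

```python
# Illustrative economizer-mode selection for a pumped-refrigerant DX system
# with free cooling below 55 °F, as described on this slide.

FREE_COOLING_SETPOINT_F = 55.0  # changeover temperature from the slide

def cooling_mode(outdoor_temp_f: float) -> str:
    """Pick the cooling mode from outdoor dry-bulb temperature."""
    if outdoor_temp_f < FREE_COOLING_SETPOINT_F:
        return "free-cooling"  # pumped refrigerant only, compressors off
    return "dx"                # mechanical (compressor) cooling

for t in (40.0, 54.9, 55.0, 72.0):
    print(f"{t:5.1f} °F -> {cooling_mode(t)}")
```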

  15. Cooling • N+1 cooling capacity • Liebert DSE pumped refrigerant with free cooling • CFD validation • Highest efficiency with hot aisle containment • Typical PUE = 2.0 [Diagram: cold supply air in, hot exhaust air out] PUE = Total D.C. Power ÷ Power to Racks
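The PUE ratio on this slide is straightforward to compute from the two readings it names. A minimal sketch; the meter values below are made-up placeholders, not measurements:

```python
# PUE = Total Data Center Power / Power to Racks (IT load), per this slide.

def pue(total_dc_power_kw: float, power_to_racks_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    if power_to_racks_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_dc_power_kw / power_to_racks_kw

total_kw = 1125.0  # hypothetical total facility meter reading
it_kw = 750.0      # hypothetical rack (IT) load

print(f"PUE = {pue(total_kw, it_kw):.2f}")  # 1.50, the deck's design goal
```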

  16. Security • 24/7 active alarm • An MCOLES-certified PA 330 police authority co-located in the Data Center building • Dual authentication with prox card and biometrics • 24/7 monitored (by a third party) • Megapixel CCTV security cameras with remote viewing • Motion-activated video recording with a 90-day minimum retention • Non-clients/vendors 100% escorted

  17. EPO – Emergency Power Off NEC Article 645 – Disconnecting Means (Emergency Power Off). Section 645.10 of the 2008 NEC requires a disconnecting means for each zone in the IT room. Section 645.10 of the 2011 NEC offers two alternatives for the disconnecting means: (A) covers remote disconnect controls, with requirements the same as the 2008 NEC, and (B) covers critical operations data systems. Critical operations data systems (defined in 645.2) are permitted alternate disconnecting means provided that five additional conditions are met: (1) an approved shutdown procedure has been established; (2) qualified personnel are continuously available 24/7; (3) smoke sensors are in place; (4) a fire suppression system is in place; (5) plenum cables are used for signaling. Design alternatives: • Building vs. room • UPS vs. feeder • General vs. first-responder activation • Fire suppression activation • Code requirements… equipment servicing room • CRACs and IT equipment vs. CRACs only. EPO: • First responder only • Equipment servicing room – CRACs – agent effectiveness • IT equipment power – optional • Power for lighting & utility outlets

  18. Fire Suppression • New evaporative particulate detection • Clean agent (FM-200/ECARO) – dual detector, 165 °F • Dry pipe – dual action – 185 °F • Two detectors must be active to charge lines (interlock sketched below) • Pellet-melt water zone • First responder training
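The dual-detector interlock above (two independent detectors must both alarm before the pre-action lines charge) reduces the chance that one faulty detector releases agent or water. A hypothetical sketch of that 2-of-2 condition:

```python
# Hypothetical sketch of the dual-detector pre-action interlock on this
# slide: both detectors must alarm before the dry-pipe lines are charged.

def charge_lines(detector_a_alarm: bool, detector_b_alarm: bool) -> bool:
    """Charge the pre-action lines only on a confirmed (2-of-2) alarm."""
    return detector_a_alarm and detector_b_alarm

assert charge_lines(True, True) is True    # confirmed fire: charge lines
assert charge_lines(True, False) is False  # single detector: hold off
```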

  19. Monitoring • Preventative vs. reactive • How much – granular view vs. sensory overload • Methods & protocols (polling sketch below): • SNMP • BACnet • Modbus • Dry contact • Alerting: • email • text • phone call • audible alarms • Response policy • Infrastructure HW vs. network
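A minimal sketch of the SNMP method named above, polling a UPS battery-charge OID from the standard UPS-MIB (RFC 1628) and alerting below a threshold. It assumes the pysnmp library (4.x hlapi); the host name, community string, and threshold are illustrative:

```python
# Illustrative SNMP poll of a UPS using pysnmp and the standard UPS-MIB.
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

UPS_HOST = "ups-a.example.net"         # hypothetical UPS address
CHARGE_OID = "1.3.6.1.2.1.33.1.2.4.0"  # upsEstimatedChargeRemaining (%)
ALERT_THRESHOLD = 80                   # alert below this charge level

err_ind, err_status, err_index, var_binds = next(
    getCmd(
        SnmpEngine(),
        CommunityData("public"),       # v2c community; prefer v3 in production
        UdpTransportTarget((UPS_HOST, 161), timeout=2, retries=1),
        ContextData(),
        ObjectType(ObjectIdentity(CHARGE_OID)),
    )
)

if err_ind or err_status:
    print(f"ALERT: poll failed for {UPS_HOST}: {err_ind or err_status}")
else:
    charge = int(var_binds[0][1])
    if charge < ALERT_THRESHOLD:
        print(f"ALERT: {UPS_HOST} battery at {charge}%")  # feed email/text/page
    else:
        print(f"OK: {UPS_HOST} battery at {charge}%")
```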

  20. Monitoring Considerations • Critical to uptime and PUE • Hand in hand with redundancy • SNMP • BACnet • Modbus • BMS • DCIM [Diagram: transfer switch and facility power meter readings feed the calculation] PUE = Total D.C. Power ÷ Power to Racks (meter-polling sketch below)
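Continuing the sketch, the facility-meter and rack-level readings in the diagram could be pulled over Modbus/TCP to drive the same PUE calculation. It assumes the pymodbus library (3.x API); the register addresses and ×0.1 kW scaling are entirely hypothetical, since every meter has its own register map:

```python
# Illustrative Modbus/TCP reads of two power meters to compute PUE live.
from pymodbus.client import ModbusTcpClient

def read_kw(host: str, register: int) -> float:
    """Read one holding register and scale to kW (hypothetical x0.1 scale)."""
    client = ModbusTcpClient(host)
    try:
        if not client.connect():
            raise ConnectionError(f"cannot reach meter at {host}")
        rr = client.read_holding_registers(register, count=1, slave=1)
        if rr.isError():
            raise IOError(f"read failed at {host}:{register}")
        return rr.registers[0] * 0.1
    finally:
        client.close()

total_kw = read_kw("meter-facility.example.net", 100)  # total D.C. power
it_kw = read_kw("meter-racks.example.net", 100)        # power to racks

print(f"PUE = {total_kw / it_kw:.2f}")
```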

  21. Management Control Points • Policy compliance – SSAE 16 SOC 2, HIPAA • ~100 control policies with a quality control repository • Operations guide • Risk analysis & mitigation plan – xx points • Disaster recovery plan – first responder guide • Employee handbook • DCIM & asset management • Incident management & ticketing system response times

  22. Maintenance • Preventative • Service-affecting or non-service-affecting • Notification of clients (2-3 weeks in advance) • Network and compute redundancy and DR • CRACs & condensers • Primary transformer • Generator – switchgear • UPS – wrap-around maintenance bypass • Breakers (arc flash) & coordination • Fire suppression & EPO • Transfer switches & control logic

  23. Data Center Footprint

  24. Data Center Academic Programs • Academic program focus: • Data center design • Operations management • Continuing education seminars • Teaching lab in data center for hands-on learning • Lab sponsorships being sought from EMC, Cisco, HP, Dell, etc. • Focus on latest offerings/technologies

  25. Data Center Design Benefits • A SSAE16 SOC2, HIPAA, PCI compliant facility • Superior Infrastructure • Superior Redundancy • Superior Power • Security • Expertise in Commercial Data Center Design and Management

  26. Data Center Networking Redundancy [Add Vblock schematic here] MDF = Main Distribution Facility; LGX = Light Guide Cross-connect (carriers); schematic labels: BGP, HSRP

  27. DTE Substation Supplying Data Center

  28. Data Center

  29. Caging • Purpose vs. necessity • Access control – individually locked racks • Layers of security • Styles: • none • traditional • wrap • individually locked racks • Impact on floor plan

  30. Rack Management • AUP – Acceptable Use Policy • Velcro vs. zip ties • Cable labeling • Blanking plates • Fan direction

  31. Network & Cloud • A virtualized MDF infrastructure • Carrier neutral • BGP carrier redundancy • Edge router redundancy • Multi-tenant cloud services • Multiple carrier service delivery modes • Ready for BUaaS, BCaaS, IaaS, etc.

  32. Data Center Provides • Colocation services focused on power, space, bandwidth, and physical security • Flexibility – Independently variable levels of space, power, bandwidth, cooling, etc. • Environmental controls • Security • Carrier neutral/multiple carriers
