
UM/MITC Data Center


Presentation Transcript


1. UM/MITC Data Center
University of Michigan, CSG, May 2006

2. Shared Data Center
• MITC building built and owned by MAVDevelopment, opened in January 2005
• 10,000 SF data center shell on lower level
• Capable of supporting a Tier III installation
• U-M 85%
• MITC (Internet2/Merit) 15%
• Main Electrical Room
• MDF – network ISP meet-me space
• Prep and storage space near loading dock

3. Multi-purpose Facility
• East half of room is “server class” space
  • Up to 160 W / SF power / cooling load
  • 78 U-M racks (70 for servers, 8 for network)
  • 52 MITC racks (48 for servers, 4 for network)
• West half of room is “HPC class” space
  • Up to 240 W / SF power / cooling load
  • 194 U-M racks (182 for servers, 12 for network)
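
A minimal sketch of how those densities roll up, assuming the two halves of the 10,000 SF shell are roughly 5,000 SF each (the slides give only the total area, so the split is an assumption):

```python
# Rough roll-up of the design load, assuming the two halves of the 10,000 SF
# room are about 5,000 SF each (an assumption; the slides state only the total).
server_sf, hpc_sf = 5_000, 5_000
server_w_per_sf = 160   # east half, "server class" (slide 3)
hpc_w_per_sf = 240      # west half, "HPC class" (slide 3)

total_kw = (server_sf * server_w_per_sf + hpc_sf * hpc_w_per_sf) / 1_000
print(f"Estimated equipment load: {total_kw:.0f} kW")   # ~2,000 kW, i.e. 2 MW
```

The estimate lands at about 2 MW, which is consistent with the 2 MW equipment allocation on slide 5.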

4. Data Center Floor Plan

5. Data Center Electrical Systems
• 4 megawatts of power
  • 2 MW for equipment
  • 2 MW for cooling, lighting, etc.
• Flywheels
  • 2 MW for equipment
  • Condition power from DTE
  • Transition to generators in a power outage
• Generators
  • Three 2 MW generators
  • Need two running to support full load
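
A short sketch of the generator arithmetic behind “need two running to support full load”; the N+1 framing is an interpretation, not wording from the slide:

```python
# Generator sizing from slide 5: 4 MW full building load, three 2 MW units.
full_load_mw = 4             # 2 MW equipment + 2 MW cooling, lighting, etc.
generator_mw = 2
generators_installed = 3

needed = -(-full_load_mw // generator_mw)   # ceiling division -> 2 units
spare = generators_installed - needed       # 1 spare unit (N+1 coverage)
print(f"Units needed at full load: {needed}, spare units: {spare}")
```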

6. Data Center Cooling
• 16 traditional 30-ton A/C units
  • Cooling for 120 W / SF load throughout the room
  • Blow cold air up through the raised floor
  • Practical limit to this type of cooling in this space
• 26 8-ton XDO units
  • Cooling for 80 W / SF load, west half only
  • Blow cold air down from the ceiling
• 52 4-ton XDV units
  • Cooling for another 40 W / SF load throughout
  • Somewhat flexible installation across room
• XDO/XDV units use phase-change refrigerant, not glycol; gaseous at room temperature
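
A rough conversion of the stated W / SF cooling figures into tons of refrigeration, compared with the installed tonnage; it assumes the same 5,000 SF west half as above and the standard 3.517 kW per ton:

```python
# Convert each system's stated cooling density to tons and compare with
# installed tonnage. Areas: 10,000 SF room, west half assumed 5,000 SF.
KW_PER_TON = 3.517
ROOM_SF, WEST_SF = 10_000, 5_000

systems = [
    # (name, stated W/SF, area served in SF, units installed, tons per unit)
    ("30-ton CRAC", 120, ROOM_SF, 16, 30),
    ("8-ton XDO",    80, WEST_SF, 26,  8),
    ("4-ton XDV",    40, ROOM_SF, 52,  4),
]

for name, w_per_sf, area_sf, units, tons_each in systems:
    load_tons = w_per_sf * area_sf / 1_000 / KW_PER_TON   # stated load in tons
    installed_tons = units * tons_each                    # nameplate capacity
    print(f"{name}: load ~{load_tons:.0f} tons, installed {installed_tons} tons")
```

Installed capacity exceeds the stated load for each system, which presumably leaves margin for redundancy and derating; the slides do not say how that headroom is allocated.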

7. Additional Cooling
• Three 30-ton A/C units for the Main Electrical Room
• Two 20-ton A/C units in the MDF
• Eight dry coolers on the roof
• Redundant pumps to circulate glycol through the cooling loop to all of the A/C units and the dry coolers
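
For context, a small tally of the tonnage riding on that glycol loop, combining unit counts from slides 6 and 7 and excluding the refrigerant-based XDO/XDV units:

```python
# Tonnage on the glycol loop served by the eight roof dry coolers, per
# slides 6 and 7. XDO/XDV units are left out because slide 6 says they use
# phase-change refrigerant rather than glycol.
glycol_loop_tons = {
    "data room 30-ton CRAC units": 16 * 30,
    "Main Electrical Room 30-ton units": 3 * 30,
    "MDF 20-ton units": 2 * 20,
}
print(f"Glycol loop total: {sum(glycol_loop_tons.values())} tons")  # 610 tons
```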

8. Networking
• Building has 4 duct banks that enter on the 4 corners of the property
• U-M has fiber connectivity, will install additional fiber on second path
• MITC Foundation has fiber to campus and Ann Arbor meet-me at Hands-On Museum
• Total ~400 fibers on three paths
• Core/distribution equipment in MDF
• Overhead cable tray provides
  • 12/12 composite fiber from MDF to each row of racks
  • Copper distribution from rack to rack

9. Current Activities
• Build-out of Data Center has begun
• Operating Agreement being developed
• Operating Budget being developed
