
CERN Computer Centre Consolidation Project


Presentation Transcript


  1. CERN Computer Centre Consolidation Project Vincent Doré IT Technical Forum - 10th June 2011

  2. Overview
  • Current configuration of the CC (excluding B613)
  • Current issues
  • Consolidation project
    • Goals
    • Technical choices
    • Plans and views
    • New IT room
    • Main dates
  • On-going work outside the project
  • Questions
  Computer Center Consolidation – VD - 100611 - 2

  3. Current configuration - 1
  • Designed and built in the early 1970s
  • Fully refurbished around 2000
  • 2 types of equipment, both on UPS:
    • Physics (single power feed, no diesel backup)
    • Critical (dual power feed and diesel backup)
  • Power capacity:
    • Nominal current capacity: 2.5 MW (including 240 kW of critical power)
    • Extended (usable) capacity: 2.9 MW (including 340 kW of critical power), forfeiting redundancy for all UPS systems
  • 2 main IT rooms (R-050 - 1400 m² & Vault - 1200 m²); several dedicated labs (telecom)
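As a quick sanity check on the figures above, the physics share of each power envelope follows by subtracting the critical allocation from the total. A minimal sketch; all figures come from the slide, and the assumption that "physics = total − critical" is the only inference added:

```python
# Power envelope figures from the slide (kW).
NOMINAL_TOTAL = 2500      # nominal capacity, with UPS redundancy intact
NOMINAL_CRITICAL = 240
EXTENDED_TOTAL = 2900     # extended capacity, forfeiting UPS redundancy
EXTENDED_CRITICAL = 340

def physics_share(total_kw: float, critical_kw: float) -> float:
    """Physics power is what remains once critical power is set aside (assumed split)."""
    return total_kw - critical_kw

nominal_physics = physics_share(NOMINAL_TOTAL, NOMINAL_CRITICAL)     # 2260 kW
extended_physics = physics_share(EXTENDED_TOTAL, EXTENDED_CRITICAL)  # 2560 kW
```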

  4. Current configuration - 2
  • 3 physics UPS, each with 4 × 400 kVA modules
  • 1 critical UPS with 2 × 300 kVA modules
  • Air cooling installations: for IT equipment including telecoms (capacity up to ~2.5 MW), technical areas... and offices
    • (3+1) chillers secured by neither UPS nor diesel (water at ~5°C)
    • Pumps and ventilation units secured by diesel
    • Water tank for stored cooling capacity
  • Water cooling installations: for IT physics equipment (capacity up to 800 kW)
    • (1+1) chillers secured by neither UPS nor diesel (water at ~14°C)
    • Pumps secured by (physics) UPS
    • 80 water-cooled racks secured by physics UPS
    • Water tank for stored cooling capacity
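The N+1 arithmetic behind these module counts can be sketched as follows: with n modules per UPS, holding one in reserve leaves (n−1) usable. The kVA-to-kW conversion uses an assumed power factor of 0.8, an illustrative assumption rather than a figure from the slides; at that factor the critical UPS's redundant capacity works out to the 240 kW limit quoted later in the talk:

```python
def usable_kva(modules: int, kva_per_module: float, redundant: bool = True) -> float:
    """Usable capacity of one UPS; with N+1 redundancy one module is held in reserve."""
    active = modules - 1 if redundant else modules
    return active * kva_per_module

# Critical UPS: 2 x 300 kVA modules.
critical_redundant = usable_kva(2, 300)                  # 300 kVA with one module spare
critical_full = usable_kva(2, 300, redundant=False)      # 600 kVA, no reserve

POWER_FACTOR = 0.8                                       # assumed, for illustration only
critical_kw = critical_redundant * POWER_FACTOR          # 240 kW at the assumed power factor
```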

  5. Current issues - 1
  • Cooling:
    • Insufficient cooling for the critical UPS room
    • Impossible to improve the cooling in the critical UPS room
    • CC not cooled when running on UPS without diesel
    • Insufficient cooling when running on diesel:
      • IT critical equipment running
      • Pumps & ventilation units running
      • No chiller and insufficient stored cold water
    • Offices in B31 cooled by the CC

  6. Current issues - 2
  • Power
    • Insufficient critical power available
    • No redundancy for critical UPS (> 240 kW)
    • No redundancy for physics UPS (> 2.7 MW)
  • Other
    • Limited fire protection in B513
    • Critical areas and physics areas strongly coupled (share the same locations, cooling infrastructure and fire risks)
    • Improving the critical infrastructure implies improving the physics one... at a high price

  7. Consolidation Project
  • Main goals
    • Secure the cooling and fire protection of critical IT equipment... which should be able to run as long as power is available (forever on diesel)
    • Increase the available critical power for IT equipment
    • Restore N+1 redundancy for both critical and physics UPS
    • Decouple the cooling of B31 from the CC
  • Main points on the wish list
    • Increase the physics power (if cost-effective)
    • Secure the cooling of physics equipment

  8. Technical choices - 1
  • Upgrade of the electrical infrastructure
    • Total capacity of 3.5 MW of IT equipment, including 600 kW of critical equipment
    • N+1 redundancy for critical and physics UPS
  • Separation of the critical and physics infrastructure
    • 3 critical areas:
      • New IT room (450 kW)
      • Extended telecom labs (50 kW)
      • Legacy network area in R-050 (100 kW)
    • New dedicated cooling installation for the critical areas (dual power feed, diesel)
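The three critical areas add up exactly to the 600 kW critical budget; a one-line check, with all figures taken directly from the slide:

```python
# Critical-area power budgets from the slide (kW).
critical_areas = {
    "new IT room": 450,
    "extended telecom labs": 50,
    "legacy network area in R-050": 100,
}
total_critical = sum(critical_areas.values())  # 600 kW, matching the 600 kW critical budget
```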

  9. Technical choices - 2
  • Construction of a new partially sunken building dedicated to the cooling of the critical areas
    • (2+1) new chillers
    • 1 new storage tank
    • Pumps
  • New ventilation system for the critical UPS room
  • Conversion of the Barn area to house:
    • 2 electrical rooms (batteries, UPS and switchboards)
    • An IT room with water-cooled racks for critical IT equipment (up to 480 kW)
  • New ventilation system for the legacy critical network area in R-050 and for the new electrical rooms

  10. Technical choices - 3
  • Modification of the telecom area
    • Extension into adjacent labs
    • Fire-protected walls
    • New ventilation systems
  • Modification of the technical rooms (basement)
    • Move of the critical UPS (from S-435 to S-437)
    • New cooling control (in S-400, S-401 and S-402)
    • Addition of 2 new electrical transformers (adjacent to the existing ones)
    • Addition of 1 new water tank for physics equipment (adjacent to the existing one)

  11. 3D view from the car park

  12. 3D view of the ground floor

  13. Ventilation in the Barn

  14. Telecom critical area

  15. 3D view of the basement

  16. The new IT room
  • Critical IT equipment only (450 kW; room designed for 480 kW)
  • Cooling of IT equipment still being discussed; most likely water-cooled racks with passive heat exchangers:
    • No power consumption
    • Cheap (compared to other cooling solutions)
    • No fan problems
    • No cold/hot corridors; cold room
    • Cooling issues for heterogeneous equipment???
  • Very likely 90 racks (70 cm × 110 cm × 47U), average power density 5 kW/rack
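The room-level numbers above follow from the rack count and the average density; a short sketch, with all figures from the slide and only the headroom subtraction added:

```python
RACKS = 90              # planned rack count
AVG_KW_PER_RACK = 5     # average power density (kW/rack)
ROOM_DESIGN_KW = 480    # room design capacity (kW)

planned_load = RACKS * AVG_KW_PER_RACK    # 450 kW, the planned critical load
headroom = ROOM_DESIGN_KW - planned_load  # 30 kW of margin at design capacity
```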

  17. Views of a passive rack

  18. Plan of the new IT room (draft)

  19. Important dates
  • Autumn 2009: serious discussions with EN
  • May 2010: approval of the project
  • October 2010: Barn ready for dismantling
  • January 2011: preliminary electrical works
  • May 2011: start of the main civil engineering works
  • Summer 2011: MS and IT for the water-cooled racks
  • November 2011: end of the main civil engineering works
  • November 2011: start of the main cooling works
  • November 2011: start of the main electrical works
  • May 2012: start of the works for the additional water tank
  • July 2012: end of the main cooling works
  • July 2012: end of the works for the additional water tank
  • August 2012: end of the main electrical works
  • October 2012: end of the works

  20. Work outside the project - 1
  • Problematic cooling of the physics equipment
    • When running on UPS without diesel power: IT physics servers stay up... without cooling
    • No easy solution: we need power (500 kW) for the pumps and ventilation in R-050 and the Vault (> 500 kW)
    • No agreement reached when the consolidation project was approved
    • However... on-going discussions/investigations, and a good chance to power the ventilation units from the UPS

  21. Work outside the project - 2
  • Improvement of the cooling efficiency
    • To be able to cool more IT equipment and to save on cooling power
    • Main steps already achieved:
      • Cold corridors (from 2008)
      • Temperature of blown air in R-050 increased from 14°C to 16°C (September 2010)
      • (More) use of outside fresh air (February 2011; net immediate gain 200 kW)
      • Blown air in R-050 from 16°C to 18°C (2 May 2011)
      • Blown air in R-050 from 18°C to 19°C (11 May 2011)

  22. Work outside the project - 3
  • Improvement of the cooling efficiency
    • Future steps:
      • Continue to increase the temperature of the air blown in R-050 (up to 22°C? while ensuring air ≤ 27°C for all IT equipment)
      • Increase the temperature of the Vault and possibly of some UPS rooms; discussion starting with EN/CV
      • Better understand and improve the air flow from the main ventilation units to the servers in R-050
      • Continue to improve cold-air confinement where relevant
      • Increase the temperature of the chilled water?

  23. Work outside the project - 4
  • Improvement of the monitoring of the Computer Centre infrastructure
    • Important work already done to better understand the power flows in the CC and to store/display them in Lemon... still some work to do, in particular on the alarms
    • Work to be done to include temperatures and related alarms in Lemon
    • Progressive implementation of intelligent power bars in IT racks (end-2011 installations & new IT room)

  24. Video on the outside work
  • To upload

  25. Questions? See also http://cern.ch/it/cc2012
