
Collaborating at a Distance: Operations Centers, Tools, and Trends
Erik Gottschalk, Fermilab



1. Collaborating at a Distance: Operations Centers, Tools, and Trends
   Erik Gottschalk, Fermilab

2. Overview
• Operations centers for the LHC and experiments
• LHC@FNAL Remote Operations Center (ROC)
• CMS Centres Worldwide
• Innovative ideas for remote operations
• Collaborative tools
• Collaborative tools used for operations today
• Trends
• Crystal ball: future operations capabilities

3. Introduction
• With the growth of large international collaborations in High Energy Physics (HEP), there has been a growing need to develop capabilities for remote participation in daily operations at the LHC.
• Remote monitoring & diagnosing of experiments is nothing new; it has been done for more than 10 years.
• Remote operations is the next step: enabling collaborators to participate in daily operations from anywhere in the world. The most important aspects are:
  • secure access to data, devices, logbooks, and monitoring & diagnostic information;
  • safeguards, so that actions do not jeopardize or interfere with operations;
  • collaborative tools for communication and participation in shift activities;
  • remote shifts for more efficient and effective operations.

4. Original LHC@FNAL Concept (2005)
• A Place
  • that provides access to information in a manner similar to what is available in control rooms at CERN,
  • where members of the LHC community can participate in accelerator and experiment activities.
• A Communications Conduit
  • between CERN and members of the LHC community located in North America.
• An Outreach Venue
  • where visitors can see current LHC activities, and
  • see how future international projects in HEP can benefit from remote participation and remote operations capabilities.

5. LHC@FNAL Location and Layout
• [image slide; the room is labeled "CMS Centre (LHC@FNAL)"]

6. Evolution of the LHC@FNAL Concept
• How did the concept for LHC@FNAL evolve? Fermilab:
  • is a Tier-1 grid computing center for CMS,
  • has contributed to CMS detector construction,
  • hosts the LHC Physics Center (LPC) for US-CMS,
  • has built LHC machine components,
  • is part of the LHC Accelerator Research Program (LARP).
• The LPC was planning to do remote data-quality monitoring for CMS. We asked if this could be expanded to include remote shifts.
• LARP was planning to provide support for US-built accelerator components and to train people before going to CERN. We asked if this could include remote participation in LHC accelerator studies.
• We saw an opportunity for CMS detector experts to work together with accelerator scientists & engineers to contribute their combined expertise to the LHC and CMS.
• The concept for a joint LHC@FNAL remote operations center emerged.

7. Development of LHC@FNAL
• In May 2005 we formed a task force that included members from every Fermilab Division, CMS, LARP, CERN-AB, and the University of Maryland.
• The task force had an advisory board that included members from ATLAS, BNL, CERN-AB, CERN-PH, CERN-TS, and several universities.
• The task force developed a plan by consulting with people from CMS, LHC, CDF, D0, MINOS, MiniBooNE, and Fusion Energy Sciences.
• We worked with CMS and US-CMS management, as well as LARP and the LHC machine groups, at every step in the process.
• We prepared a requirements document for LHC@FNAL.
• We visited 9 sites (e.g. Hubble Space Telescope, SNS, General Atomics, ESOC) to find out how other projects build control rooms and do remote operations.
• We prepared a Work Breakdown Structure with budget estimates, and received funding from the Fermilab Director in 2006.
• We completed construction of LHC@FNAL in February 2007.
• LHC@FNAL is now part of CMS Centres Worldwide and is used for CMS detector commissioning, data operations, data quality monitoring, trigger monitoring, Fermilab Tier-1 monitoring, special events, and tour groups.

8.–18. Timeline
• Mar. 24, 2006 (acknowledgements: CMS concept, Dan Green; LHC concept, Alvin Tollestrup; Global Accelerator Network, Albrecht Wagner)
• Sep. 13, 2006: LHC@FNAL start of construction
• Feb. 14, 2007: LHC@FNAL construction completed
• Mar. 16, 2007: CMS Centre at CERN (Lucas Taylor)
• Oct. 22, 2007: LHC@FNAL dedication (HD video link to CERN)
• Feb. 6, 2008: permanent video link established between LHC@FNAL & the CMS Centre at CERN
• May 28, 2008: CMS Centres Worldwide (Lucas Taylor)
• Sep. 10, 2008: LHC First Beam Day (Fermilab pajama party)
• Oct. 3, 2008: LHC Grid Fest
• Oct. 16, 2008: CMS Centre at DESY (Guenter Eckerlin)
• Jan. 22, 2009: CMS Centre at UERJ, Brazil (Alberto Santoro)

19. CMS Centres Worldwide
• CMS Centres at: DESY, Aachen, Zurich, CERN, Oviedo, Dubna, Pisa, Fermilab (LHC@FNAL), Beijing, Adana, Delhi, Mumbai, Rio de Janeiro, Sao Paulo, and Canterbury.
• See Lucas Taylor's talk in the CHEP 2009 Collaborative Tools session: "CMS Centres Worldwide: A New Collaborative Infrastructure".

20. Innovations
• We have introduced several significant innovations that contribute to HEP remote operations capabilities. These innovations enhance security, provide safeguards, and contribute to collaborative tools that are available for use in operations.
• Group accounts with user-specific login and persistent sessions (secure access)
  • Users log in to a specific console (ECAL, tracker, trigger, etc.) using their own credentials.
  • Users can transfer control to other users in the same group without ending a login session.
• Role Based Access Control (RBAC) for LHC controls (secure access, safeguards); see the sketch after this list
  • An FNAL/CERN collaboration implemented role-based access for the LHC controls system.
  • RBAC is now used for ALL access to LHC accelerator controls at CERN.
• Screen Snapshot Service (secure access, collaborative tool)
  • Share screen content without the security risks of tools such as VNC and NoMachine NX.
• Ci2i (collaborative tool; user/resource/content management system)
  • CMS Centre users anywhere in the world can observe displays in other CMS Centres.
  • CMS-TV: web displays for outreach in public places.
  • Lucas Taylor, Tuesday, Board #080: "Ci2i & CMS-TV: Generic Web Tools for CMS Centres"
• High Definition video conferencing for operations (collaborative tool)
  • Permanent video links between CMS Centres for spontaneous & informal communication.
  • Erik Gottschalk: "High Definition Video Conferencing for High Energy Physics"
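The RBAC mechanism above can be illustrated with a short sketch. The role names, device classes, and actions below are hypothetical stand-ins chosen for illustration; the real LHC RBAC is integrated into CERN's accelerator controls middleware and ties roles to authenticated credentials.

```python
# Minimal sketch of a role-based access check for device commands.
# Role names, device classes, and actions are hypothetical stand-ins;
# the real LHC RBAC is built into the CERN controls middleware.
from dataclasses import dataclass

# Each role grants a set of (device_class, action) permissions.
PERMISSIONS = {
    "lhc-operator":   {("magnet", "read"), ("magnet", "set"),
                       ("rf", "read"), ("rf", "set")},
    "cms-shifter":    {("magnet", "read"), ("rf", "read")},
    "remote-monitor": {("magnet", "read")},
}

@dataclass
class Session:
    user: str
    roles: list  # roles carried in the user's authentication token

def is_allowed(session: Session, device_class: str, action: str) -> bool:
    """Return True if any of the session's roles permits the action."""
    return any((device_class, action) in PERMISSIONS.get(role, set())
               for role in session.roles)

# A remote shifter may read magnet settings but not change them:
shifter = Session(user="remote1", roles=["cms-shifter"])
assert is_allowed(shifter, "magnet", "read")
assert not is_allowed(shifter, "magnet", "set")
```

The safeguard is that authorization lives with the role rather than the individual, so remote participants can be given read-only roles without per-device exceptions.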

21. Other Innovative Solutions in HEP
• There are other innovative solutions in HEP. For example:
• ATLAS
  • Data processing: central and remote shifts for processing real and simulated data.
  • Analysis support: central shifts with SharePoint sites & regional analysis support structures.
• CALICE (CAlorimeter for the LInear Collider Experiment) test beam
  • Remote control of test-beam equipment using Sun Secure Global Desktop software.
  • 24/7 video conferencing connection between the FNAL test beam & a second control room at DESY.
  • Remote-controlled webcams are used to observe electronics and computer screens at FNAL.
• CDF (Collider Detector at Fermilab)
  • Remote shifts, with video used to link shift leaders to remote shift personnel.
  • Screen Snapshot Service (http://www-cdfonline.fnal.gov/java/snapshot/ShowImageList.jsp); see the sketch after this list.
• GAN (Global Accelerator Network)
  • Concept for remote operation of accelerator facilities.
  • ICFA Task Force Reports (http://www.fnal.gov/directorate/icfa/icfa_tforce_reports.html)
• MINOS (Main Injector Neutrino Oscillation Search) at Fermilab
  • The Fermilab network was extended to Soudan, Minnesota to improve network access (security).
  • Remote shifts for data-quality monitoring at Fermilab.
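The Screen Snapshot Service (used by CMS, slide 20, and by CDF above) avoids the risks of interactive tools like VNC by publishing periodic, read-only screen captures over the web. Below is a minimal sketch of that idea, assuming Pillow is available and the platform supports ImageGrab; the production service is a Java web application (see the CDF URL above).

```python
# Minimal sketch of the Screen Snapshot Service idea: publish periodic,
# read-only screen captures over HTTP instead of granting interactive
# access (as VNC or NX would). Assumes Pillow; the production service
# is a Java web application.
import threading
import time
from http.server import HTTPServer, SimpleHTTPRequestHandler

from PIL import ImageGrab  # works on Windows/macOS; X11 needs extras

SNAPSHOT = "snapshot.png"
INTERVAL = 30  # seconds between captures

def capture_loop():
    while True:
        ImageGrab.grab().save(SNAPSHOT)  # save the console screen to a file
        time.sleep(INTERVAL)

# Viewers fetch http://<host>:8000/snapshot.png: they can look, not touch.
threading.Thread(target=capture_loop, daemon=True).start()
HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()
```

Because viewers only ever receive an image, a compromised viewer cannot inject input into the console session.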

22. Innovative Solutions in Other Fields
• European Space Operations Centre (ESOC)
  • Voice-loop system used for audio communication (similar to NASA).
  • Analog video distribution system for shared displays.
• Gemini Observatory (2 sites: Hawaii & Chile)
  • Remote operation of the observatory: during operations, two people are at the Mauna Kea summit, but they are not expected to make critical decisions due to the high altitude.
  • A hardware VPN is used to link Gemini Hawaii to Gemini Chile.
  • Video is much better for operations than telephone, email, & text-based tools.
• General Atomics (fusion energy research)
  • Webcams in the control room let remote users see where someone is sitting near a phone.
  • The Fusion Energy Science community has many other innovative solutions.
• NIF (National Ignition Facility)
  • Internal network for NIF control; external network for monitoring and diagnosing.
  • Access is provided to all data for monitoring, using SharePlex database replication.
• SNS (Spallation Neutron Source)
  • EPICS control system with access-control restrictions.

23.–26. Collaborative Tools for Operations
• Collaborative tools we are currently using, have considered using, or are evaluating for LHC and/or CMS remote operations.
• There are many tools available!
• Desirable qualities:
  • Reliable
  • Easy to use
  • Interoperable
  • Comprehensive
• Tools that are used in CMS Centres today.

27. Collaborative Tools Used in CMS Centres
• Audio & Video
  • There are numerous choices for audio & video communications, but the only tools we tend to use are telephone, teleconferencing, H.323 videoconferencing, and EVO.*
  • Video is much better for communication in an operations environment than audio-only tools, or reliance on email and other text-based tools alone.
• File Sharing & Text Messaging
  • There are many tools for exchanging and recording textual information.
  • The emphasis should be on selecting specific tools (i.e. not introducing new ones), integrating existing tools, and establishing a more homogeneous work environment.
• Remote Access & Desktop Viewing or Sharing
  • We rely heavily on web-based monitoring tools. The emphasis should be on making as much information as possible available on the web in a secure manner; a minimal sketch follows below.
  • Information that is not easily accessible invariably introduces remote access and security concerns, or discourages people from collaborating in the first place.
• * EVO is the only tool used in CMS Centres that addresses all aspects of communication in a single tool.
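As a concrete illustration of the "web-based, but secure" emphasis above, here is a minimal sketch of a read-only monitoring endpoint guarded by a shared bearer token. The token scheme and status fields are hypothetical stand-ins; real CMS monitoring runs behind the experiment's own web services and authentication, and anything like this would need to sit behind TLS.

```python
# Minimal sketch of read-only, web-based monitoring behind a token check.
# The token scheme and status fields are hypothetical stand-ins; in
# practice this must run behind TLS and real authentication.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

TOKEN = "replace-with-a-real-secret"  # hypothetical shared secret

def current_status():
    # Stand-in for real detector/DAQ monitoring values.
    return {"run": 12345, "daq_state": "Running", "event_rate_hz": 310.0}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reject requests that do not carry the expected bearer token.
        if self.headers.get("Authorization") != "Bearer " + TOKEN:
            self.send_error(401, "Unauthorized")
            return
        body = json.dumps(current_status()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)  # read-only: no POST/PUT handlers exist

HTTPServer(("", 8000), StatusHandler).serve_forever()
```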

28. EVO Infrastructure for Collaboration
• EVO usage: 18,900 registered users; 600 users per day; 160 meetings per day.
• [diagram: EVO connects desktops, H.323 MCUs, phones, SIP clients, and handheld devices; features include OpenGL video, monitoring via MonALISA, whiteboard & shared files, record/playback, and IM/chat]

29. Trends in HEP
• Audio & Video
  • The industry trend is to improve audio & video quality and reduce the cost of systems (e.g. HD videoconferencing systems are replacing standard-definition systems).
  • The HEP operations trend is towards "telepresence", so that collaborating at a distance becomes more and more like co-located operations teams in a control room.
• File Sharing & Text Messaging
  • More collaborative tools are being introduced, with individual groups using their favorite tool. Selecting specific tools or standards would be a big help.
• Remote Access & Desktop Viewing or Sharing
  • More and more reliance on tools for web-based monitoring and diagnosing.
  • The trend is towards more security, thereby making the use of collaborative tools more difficult.

30. Crystal Ball (2, 4, 6, 8, and 10 years)
• Future operations capabilities that might be implemented with technologies that are available today, or almost available today.
• 2 years: telepresence for operations; 90% of CMS institutions will have CMS Centres
• 4 years: networks for collaboration
• 6 years: collaborative high-resolution video walls
• 8 years: role-aware, presence-aware communication
• 10 years: immersive co-location of globally distributed operations teams

31. Crystal Ball (2 years)
• Telepresence systems will be used for operations.
• 90% of CMS institutions will have their own CMS Centre.

32. Crystal Ball (4 years)
• Integrated networks that implement secure tunneling technologies (which are available today) will encourage more collaboration in HEP on a global scale. Current approaches to networking and security discourage people from collaborating when they could be making valuable contributions.

33. Crystal Ball (6 years)
• Collaborative high-resolution video walls mirrored at multiple sites.

34. Crystal Ball (8 years)
• We will have role-aware and presence-aware communications capabilities for HEP operations; a sketch of the idea follows below.
• Role-aware: a collaborator can talk to a person who can help with a specific task, not necessarily a specific individual.
• Presence-aware: presence information is integrated and generated automatically. For example, a person waiting in line for lunch might be reachable by phone or text messaging, but not by email or video.
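A minimal sketch of what role-aware, presence-aware routing might look like; the presence states, channels, roles, and names are all invented for illustration.

```python
# Minimal sketch of role-aware, presence-aware communication routing.
# Presence states, channels, roles, and names are all invented.

# Which channels each presence state supports.
PRESENCE_CHANNELS = {
    "at-console":  ["video", "phone", "chat", "email"],
    "in-meeting":  ["chat", "email"],
    "away-mobile": ["phone", "chat"],  # e.g. waiting in line for lunch
    "offline":     ["email"],
}

# Role-aware lookup: reach whoever currently holds the role, not a
# specific named individual.
ON_CALL = {"tracker-expert": ("alice", "away-mobile")}

def reachable_by(presence, preferred):
    """Pick the first preferred channel that the presence state allows."""
    allowed = PRESENCE_CHANNELS.get(presence, [])
    return next((ch for ch in preferred if ch in allowed), None)

def contact_role(role, preferred):
    """Resolve a role to the current on-call person and a usable channel."""
    person, presence = ON_CALL[role]
    return person, reachable_by(presence, preferred)

# A shifter asks for the tracker expert, preferring video:
print(contact_role("tracker-expert", ["video", "phone", "email"]))
# -> ('alice', 'phone'): in the lunch line, reachable by phone, not video
```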

35. Crystal Ball (10 years)
• Virtual reality environments for HEP operations. These environments will integrate high-resolution shared displays, secure collaborative networks, and telepresence for immersive co-location of globally distributed operations teams.

36. Thanks
• I wish to thank several people for contributions and information: Paul Van Arsdall (NIF), Wayne Baisley (photography), Toby Clark (ESA), Philippe Galvez (EVO), Mario Giannella (SNS), Reidar Hahn (photography), Sven Karstensen (CALICE), James Kennedy (Gemini Observatory), Kaori Maeshima (CDF), David Schissel (General Atomics), Lucas Taylor (CMS Centres & Ci2i), William Trischuk (ATLAS), and Karsten Weber (ESA).

37. Other Talks/Posters at CHEP 2009
• Collaborative Tools session (Monday and Tuesday)
  • Lucas Taylor et al., "CMS Centres Worldwide: A New Collaborative Infrastructure"
  • Philippe Galvez, "EVO (Enabling Virtual Organizations)"
  • Erik Gottschalk et al., "High Definition Videoconferencing for High Energy Physics"
  • Steven Goldfarb, "Collaborative Tools and the LHC: Some Success, Some Plans"
• Distributed Processing and Analysis session (Tuesday and Thursday)
  • Lassi Tuura et al., "CMS Data Quality Monitoring: Systems and Experience"
  • David Mason, "Remote Operation of the Global CMS Data and Workflows"
• Poster session (Monday)
  • Lassi Tuura et al., "CMS Data Quality Monitoring Web Service" (Board #071)
  • Gordon Watts, "DeepConference: A Complete Conference in a Picture" (Board #086)
• Poster session (Tuesday)
  • Joao Fernandes et al., "Virtuality and Efficiency - Overcoming Past Antinomy in the Remote Collaboration" (Board #077)
  • Lucas Taylor et al., "Ci2i and CMS-TV: Generic Web Tools for CMS Centres" (Board #080)
