Laserhut

Presentation Transcript


Laserhut

[Figure: layout diagram rendered here only as columns of numeric labels; areas marked "Laserhut" and "Cooling"]

DCS network in UX

  • Fixed installations -- [power supplies, VME crates etc.]

    • 248 ports, 100 Mbit/s, only in racks

    • Based on: rack layout, DCDB, individual requests

  • Fixed installations -- [“ALICE” switches for DCS boards, RCU]

    • 15 ‘uplinks’: one 1 Gbit/s uplink per group of switches (rack)

  • Additional network -- [e.g. for commissioning/debugging]

    • Wireless (enough to cover whole cavern)

    • ~50 ports ‘strategically distributed across rack areas’
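A quick back-of-envelope check of what "one 1 Gbit/s uplink per group of switches" implies for worst-case oversubscription. The port speed and per-group uplink come from the slide; the number of switches per group and ports per switch are hypothetical illustration values.

```python
# Worst-case oversubscription on one shared rack-group uplink.
# switches_per_group and ports_per_switch are HYPOTHETICAL values;
# the 100 Mbit/s edge ports and 1 Gbit/s uplink are from the slide.
switches_per_group = 2       # hypothetical
ports_per_switch = 24        # hypothetical
port_speed_mbps = 100        # 100 Mbit/s edge ports
uplink_speed_mbps = 1000     # one 1 Gbit/s uplink per group

# Aggregate demand if every port transmitted at line rate simultaneously
demand_mbps = switches_per_group * ports_per_switch * port_speed_mbps
oversubscription = demand_mbps / uplink_speed_mbps
print(f"worst-case oversubscription {oversubscription:.1f}:1")
```

In practice DCS traffic is far below line rate, so a modest oversubscription ratio like this is usually acceptable for slow-control links.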



Laserhut


  • All connected to IT/CS installed and maintained switches

    • Located in accessible areas

  • Concentration of ports for a single detector [TRD, MCH, CTP]

    • Direct cabling could be replaced by links to “Alice” switches

      • Cost saving on cabling and IT/CS switches

      • Extra cost for ‘uplinks’ and “Alice” switches

      • Net cost saving ~25 kCHF
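The cost trade-off above can be sketched as simple arithmetic. The slide only states the net result (~25 kCHF); every component figure below is a hypothetical placeholder chosen to illustrate the structure of the calculation.

```python
# Cost trade-off of replacing direct cabling with "Alice" switch uplinks.
# All component figures are HYPOTHETICAL -- only the net (~25 kCHF) is
# stated on the slide.
saved_cabling_kchf = 30.0          # hypothetical: direct UX cabling avoided
saved_it_ports_kchf = 15.0         # hypothetical: IT/CS switch ports avoided
extra_uplinks_kchf = 8.0           # hypothetical: additional 'uplinks'
extra_alice_switches_kchf = 12.0   # hypothetical: extra "Alice" switches

net_saving = (saved_cabling_kchf + saved_it_ports_kchf
              - extra_uplinks_kchf - extra_alice_switches_kchf)
print(f"net saving ~{net_saving:.0f} kCHF")
```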


Laserhut

  • These are savings on installation costs

  • Other aspects to be considered:

    • Maintenance – these “Alice” switches need to be operated, maintained, etc. by ALICE

      • An effort for the lifetime of ALICE

    • Operation – A failing switch will have very serious impact on the operation of ALICE

      • Note these switches will operate in a (low) magnetic field, with unknown long-term reliability, and sit in a non-accessible area [TRD tests proved they run in a magnetic field, but the test lasted only a few hours]

      • A failing switch [for TRD, MCH] will leave a large part of the LV running ‘uncontrolled’. Hardwired interlocks or switching off a rack are the only means of ‘control’

      • A failing switch [for CTP] will leave the whole CTP without network connection

      • Access to UX needs to be granted by LHC to replace a faulty switch; at most only in between fills? → Considerable loss of physics time

  • We propose to adopt the safer solution of individually connected devices on an IT/CS maintained network
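The downtime concern behind this proposal can be made rough-quantitative. The slide only notes that UX access is limited to gaps between LHC fills; the fleet size, failure probability, and repair delay below are all hypothetical assumptions for illustration.

```python
# Expected yearly downtime from in-cavern switch failures.
# ALL input figures are HYPOTHETICAL -- the slide gives no numbers here,
# only the qualitative argument that repairs must wait for an access window.
n_switches = 15                  # hypothetical fleet of "Alice" switches in UX
annual_failure_prob = 0.05       # hypothetical per-switch failure prob./year
mean_repair_delay_h = 24.0       # hypothetical wait for access window + swap

expected_failures_per_year = n_switches * annual_failure_prob
expected_downtime_h = expected_failures_per_year * mean_repair_delay_h
print(f"expected ~{expected_failures_per_year:.2f} failures/year, "
      f"~{expected_downtime_h:.0f} h of degraded running per year")
```

Even small per-switch failure probabilities multiply across the fleet, which is why individually connected devices on an accessible, IT/CS-maintained network are the lower-risk option.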

