
LHCb Planning

Pete Clarke (Uni. Edinburgh)

Stefan Roiser (CERN, IT/ES)



SHORT-TERM PLANNING


Operations

  • Planning until Summer

    • Incremental Stripping to be started in April

      • Will be limited by the performance of the tape systems

      • Reminder of the bandwidth needed for tape recall (MB/s); a rough estimate is sketched after this list

      • Operation will last for 8 weeks

      • Next incremental stripping planned for fall ’13

    • Otherwise mainly Monte Carlo and user activities

    • CERN CASTOR to EOS migration close to being finished
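As a rough illustration of where such a bandwidth figure comes from, the sketch below converts an assumed recall volume into a sustained rate over the 8-week campaign; the 400 TB volume is a made-up placeholder, only the duration is taken from the slide.

# Rough estimate of the sustained tape recall rate for the stripping campaign.
# The dataset volume is a placeholder; only the 8-week duration comes from the slide.
DATASET_TB = 400          # hypothetical volume to recall from tape
DURATION_WEEKS = 8        # duration of the incremental stripping campaign

seconds = DURATION_WEEKS * 7 * 24 * 3600
required_mb_per_s = DATASET_TB * 1e6 / seconds   # 1 TB = 1e6 MB
print("Sustained tape recall rate: %.0f MB/s" % required_mb_per_s)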


Mid-term planning

(preview of currently ongoing discussions)


CVMFS deployment

  • LHCb sticks to the target deployment date of 30 April 2013

    • No more software updates to the “old shared software areas” after that date

  • Usage of the dedicated mount point for our “conditions DB” is currently under discussion (a minimal worker-node check is sketched after this list)

    • To be used in production after LS1

    • Structuring of the online conditions is also under discussion and will have an impact on usage
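A minimal worker-node check, assuming the conventional /cvmfs mount-point layout; the repository names (lhcb.cern.ch for software, lhcb-condb.cern.ch for the conditions DB) are assumptions for illustration, not confirmed deployment choices.

# Check that the expected CVMFS repositories are mounted on this worker node.
# Repository names are assumptions used for illustration only.
import os

REPOSITORIES = [
    "/cvmfs/lhcb.cern.ch",        # software releases (replaces the old shared software area)
    "/cvmfs/lhcb-condb.cern.ch",  # dedicated conditions DB mount point (assumed name)
]

missing = [repo for repo in REPOSITORIES if not os.path.isdir(repo)]
if missing:
    print("Missing CVMFS mount points: " + ", ".join(missing))
else:
    print("All expected CVMFS mount points are present.")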


Tighter integration with T2 sites

  • Discussions are currently ongoing on how to integrate some T2 sites into more workflows

    • Minimum requirements will be published (a hypothetical check is sketched after this list)

      • E.g. X TB of disk space, Y worker nodes (WNs), etc.

      • Those sites will then also be able to run, e.g., analysis jobs from local disk storage elements

    • Better monitoring and “performance measurements” of those sites will be needed

      • Publish LHCb measurements into IT monitoring (SUM, Dashboard)
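A hypothetical sketch of what such a published minimum-requirements check could look like; the attribute names and numbers are placeholders, since the real values were still under discussion.

# Hypothetical minimum-requirements check for a T2 candidate site.
# All names and values below are placeholders, not agreed LHCb numbers.
MIN_REQUIREMENTS = {
    "disk_tb": 300,        # "X TB of disk space" (placeholder)
    "worker_nodes": 100,   # "Y number of WNs"    (placeholder)
}

def qualifies_for_analysis(site):
    """True if the site meets every minimum and offers a local disk SE."""
    return (site.get("disk_tb", 0) >= MIN_REQUIREMENTS["disk_tb"]
            and site.get("worker_nodes", 0) >= MIN_REQUIREMENTS["worker_nodes"]
            and site.get("has_local_se", False))

print(qualifies_for_analysis({"disk_tb": 450, "worker_nodes": 250, "has_local_se": True}))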


FTS3 integration and deployment

  • Discussions are currently ongoing about the features needed by the experiment and their implementation

    • E.g. bring-online, number of retries (a submission sketch follows this list)

  • Test instance with all needed functionality close to deployment
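As a sketch of how the two features mentioned above are exposed, the snippet below uses the fts-rest Python “easy” bindings; the endpoint and SURLs are placeholders, and the exact interface of the test instance discussed here may differ.

# Submit a transfer with tape bring-online and a retry count via the
# fts-rest "easy" Python bindings. Endpoint and SURLs are placeholders.
import fts3.rest.client.easy as fts3

context = fts3.Context("https://fts3-test.example.cern.ch:8446")   # placeholder endpoint

transfer = fts3.new_transfer(
    "srm://source-se.example.org/lhcb/data/file.dst",   # placeholder source SURL
    "srm://dest-se.example.org/lhcb/data/file.dst",     # placeholder destination SURL
)
job = fts3.new_job(
    [transfer],
    bring_online=3600,   # stage from tape and pin for one hour
    retry=3,             # number of retries on transient failures
)
print(fts3.submit(context, job))   # prints the FTS job id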


Federated storage

  • Federated storage usage will be implemented

  • Decision on the technology (xroot, http) not yet taken, but it shall be only one of them

    • The idea is to use fallback onto other storage elements only as an exception (see the sketch after this list)
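A minimal sketch of the “fallback only as an exception” idea, using the XRootD Python bindings; the storage element and redirector hostnames are placeholders, and the technology choice (xroot vs. http) was still open at the time.

# Open a file from the local storage element first; only fall back to the
# federation redirector if the local copy cannot be opened.
from XRootD import client   # pyxrootd bindings, assumed available on the node

LOCAL_SE   = "root://local-se.example.org//lhcb/data/file.dst"               # placeholder
FEDERATION = "root://federation-redirector.example.org//lhcb/data/file.dst"  # placeholder

def open_with_fallback(local_url, federation_url):
    """Return an open XRootD file handle, preferring the local replica."""
    for url in (local_url, federation_url):
        f = client.File()
        status, _ = f.open(url)
        if status.ok:
            return f
        f.close()
    raise IOError("no replica could be opened")

data_file = open_with_fallback(LOCAL_SE, FEDERATION)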


WLCG Information System

  • Very good abstraction layer over the underlying information systems

    • Can replace several queries currently implemented within DIRAC, e.g. BDII queries for CE discovery, … (a sketch of such a query follows this list)
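For concreteness, below is a sketch of the kind of BDII query (CE discovery) that the information-system abstraction could replace, using python-ldap; the BDII host is an example endpoint and the attribute selection is illustrative.

# Example of a raw BDII (Glue 1.x) query for computing-element discovery,
# the kind of lookup an information-system abstraction layer could replace.
import ldap

conn = ldap.initialize("ldap://lcg-bdii.cern.ch:2170")   # example top-level BDII endpoint
results = conn.search_s(
    "o=grid",                   # Glue 1.x base DN
    ldap.SCOPE_SUBTREE,
    "(objectClass=GlueCE)",     # all computing elements
    ["GlueCEUniqueID", "GlueCEHostingCluster"],
)
for _dn, attrs in results:
    print(attrs.get("GlueCEUniqueID"))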


Monitoring

  • LHCb will feed its monitoring information into the IT-provided infrastructure (e.g. SAM/SUM)

  • Better monitoring and ranking of T2 sites will be needed

    • Thresholds to be introduced

  • Better information to be provided to sites so they can find out about LHCb’s view of them

    • E.g. “why is my site currently not used?” (a threshold-based sketch follows this list)

    • Will also be provided through the DIRAC web portal
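A hypothetical illustration of how such thresholds could answer the “why is my site currently not used” question; the metric names and limits are made up for the example, not actual LHCb values.

# Hypothetical threshold-based site check; metric names and limits are
# illustrative placeholders, not agreed LHCb values.
THRESHOLDS = {
    "job_success_rate": 0.90,   # fraction of finished jobs that succeeded
    "cvmfs_probe": 1.00,        # CVMFS availability test result
    "se_read_success": 0.95,    # local SE read-test success rate
}

def site_status(metrics):
    """Return (usable, failing_checks) for a site given its measured metrics."""
    failing = [name for name, limit in THRESHOLDS.items()
               if metrics.get(name, 0.0) < limit]
    return (not failing), failing

usable, why_not = site_status({"job_success_rate": 0.97,
                               "cvmfs_probe": 1.00,
                               "se_read_success": 0.85})
print(usable, why_not)   # e.g. False ['se_read_success'] answers "why is my site not used"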


Other efforts to keep an eye on

  • perfSONAR

    • Will be helpful for network monitoring, especially in view of T2 integration into more workflows

  • SL6 deployment

    • LHCb software is pretty well decoupled from the WN installation; no major problems foreseen

      • First full SLC6-based software stack to be released soon

  • glExec

    • Is being tested within LHCb grid software


Internal Reviews

  • LHCb is conducting two internal reviews

    • Of the fitness for purpose of the distributed computing system (based on DIRAC)

    • Of the Computing Model itself

  • Both due to report ~ mid 2013


Conclusions

  • The major trends are already visible

    • Federated storage, T2 integration, more monitoring

  • Final planning and technical decisions will be available once the currently ongoing reviews of DIRAC and the Computing Model have concluded
