
GridPP2: Application Requirement & Developments Nick Brook University of Bristol


  1. GridPP2: Application Requirement & Developments — Nick Brook, University of Bristol
  • ALICE
  • Hardware Projections
  • Applications Programme
  Oxford eSc – 1st July’03

  2. Hardware
  • LHC experimental numbers based on an ongoing re-assessment exercise
  • Computing Technical Design Reports due in 2005
  • Expts will be using the LCG “system”
    • Hardware and chosen middleware
    • Software tools – e.g. POOL for persistency
  • Numbers include Tier-2 needs
  • Non-LHC experiments also gave an estimated forward look
    • Based on MC production & analysis
  • Expts are expecting a single, integrated Tier-1 centre

  3. Hardware
  • Expts are expecting a single, integrated Tier-1 centre
  • Short term, LHC expts expect some form of centralised planning via the LCG project
    • Project Execution Board & Grid Deployment Board
  • GridPP participation in LCG bodies
  • GridPP will continue with an annual h/w review
    • CPU vs Disk

  4. Ongoing Activities
  Example: LHCb Data Challenge – >40M events – 170 yrs on a 1 GHz PC; ~1/3 of events produced in the UK
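The headline figures on this slide imply a simple per-event cost. The sketch below is illustrative back-of-the-envelope arithmetic only, using the slide's own numbers (~40M events, ~170 years on a 1 GHz PC, ~1/3 produced in the UK); it is not part of the original deck.

```python
# Back-of-the-envelope check of the LHCb Data Challenge figures
# quoted on the slide (illustrative arithmetic, not original content).
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

events = 40e6        # >40M simulated events
cpu_years = 170.0    # quoted time on a single 1 GHz PC

seconds_per_event = cpu_years * SECONDS_PER_YEAR / events
print(f"~{seconds_per_event:.0f} s of 1 GHz CPU time per event")  # ~134 s

# Roughly one third of the events were produced in the UK.
print(f"UK share: ~{events / 3 / 1e6:.0f}M events")  # ~13M
```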

  5. Ongoing Activities
  Current usage of the Tier-1/A centre is dominated by BaBar – 60% of CPU, 90% of disk

  6. Networking
  • Bandwidth dominated by replication in the analysis stage of data processing
  • Use of tools, such as OptorSim, to understand networking
  • LHC expts need to understand their computing & analysis models
    • Early estimates: a factor of 5 increase
  • Current problems with MC production & bulk transfer
    • Unrelated to SuperJANET
    • Often attributable to links into the MAN

  7. CPU estimates
  • CPU resource reqts are equivalent to 14k 2.4 GHz dual processors running continuously
  • LHC expts: ~65% of the need in 2004, rising to >80% in 2007
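As a hedged worked example of how these percentages combine with the total: the split below applies the slide's ~65% and >80% LHC shares to the same 14k-processor figure, which is an assumption made purely for illustration.

```python
# Illustrative arithmetic for the CPU slide: a total requirement
# equivalent to ~14,000 dual 2.4 GHz processors, with the LHC
# experiments' share of the need quoted as ~65% (2004) and >80% (2007).
# Applying both percentages to the same total is an assumption,
# not a figure from the slide.
total_dual_processors = 14_000
lhc_share = {2004: 0.65, 2007: 0.80}

for year, frac in sorted(lhc_share.items()):
    lhc = total_dual_processors * frac
    print(f"{year}: ~{lhc:,.0f} dual processors attributable to LHC expts")
```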

  8. Disk Requirements
  • 60% of disk reqts in 2004 are for LHC expts
  • 70% of disk storage in 2007 is for LHC expts
  • Non-LHC expts still data taking – need disk to finish analyses

  9. Tape Requirements
  • Tape usage completely dominated by LHC usage – 90%
  • Large level of uncertainty
  • 2007: ATLAS (850 TB) vs CMS (1150 TB)

  10. Needs in 2004

  11. Application Development
  • Building on current collaborative activity
    • GANGA: ATLAS & LHCb
    • SAM: CDF & DØ
    • BaBar: adoption of EDG s/w
  • Prototyping → production environment
  • Monte Carlo production activity → analysis environment
  • Grid technologies becoming more widely accepted across the HEP community
    • “old” experiments – UKDMC, ZEUS, …
    • “new” activities – LCFI, MICE, …

  12. Application Development
  • A similar pattern of needs emerges from all experiments (not too surprisingly!)
  • Storage & location of data
    • Replication issues
  • Monte Carlo production tools
    • Seen as an obvious area for “efficiency” savings
  • Analysis interfaces
    • Intelligent bookkeeping of a user’s analysis activities
  • Persistency solutions
    • Composite objects spread across several storage systems

  13. New Experiments

  14. LHC experiments

  15. Non-LHC expts

  16. Application Call
  • Essential to continue to develop the application interfaces through GridPP2
  • Expand activity to allow currently non-GridPP-supported expts to participate
    • Benefit from LCG developments
  • Call for application posts – January ’04
    • Response by April
    • Reviewed à la GridPP (“Williams” committee)
      • Expt activity in the UK
      • Science output
      • Track record in Grid activity
