
US ATLAS Project Management

J. Shank, DOE/NSF review of LHC Computing, NSF Headquarters, 8 July 2003.


Presentation Transcript


  1. US ATLAS Project Management
  J. Shank
  DOE/NSF review of LHC Computing
  8 July 2003
  NSF Headquarters

  2. Outline/Charge
  • International ATLAS organization
    • Org. chart, timeline, DC plans, LCG software integration
  • US ATLAS organization
  • Project management plan for the Research Program
    • WBS and MS Project scheduling
    • Procedure for determining Computing/M&O budget split
  • FY03 budget
  • FY04 budget
  • Answers to Jan. review management issues

  3. New ATLAS Computing Organization
  [Organization chart omitted]
  Slide from D. Barberis, LHCC, 1 July 2003

  4. ATLAS Computing Timeline
  • Jul 03: POOL/SEAL release
  • Jul 03: ATLAS release 7 (with POOL persistency)
  • Aug 03: LCG-1 deployment
  • Dec 03: ATLAS completes Geant4 validation
  • Mar 04: ATLAS release 8
  • Apr 04: DC2 Phase 1: simulation production
  • Jun 04: DC2 Phase 2: reconstruction (the real challenge!)
  • Jun 04: Combined test beams (barrel wedge)
  • Dec 04: Computing Model paper
  • Jul 05: ATLAS Computing TDR and LCG TDR
  • Oct 05: DC3: produce data for PRR and test LCG-n
  • Nov 05: Computing Memorandum of Understanding
  • Jul 06: Physics Readiness Report
  • Oct 06: Start commissioning run
  • Jul 07: GO!
  Slide from D. Barberis, LHCC, 1 July 2003

  5. How to get there: 1) Software
  • Software developments in progress:
    • Geant4 simulation validation for production
    • GeoModel (Detector Description) integration in simulation and reconstruction
    • Full implementation of new Event Data Model
    • Restructuring of trigger selection, reconstruction and analysis environment
    • POOL persistency
    • Interval of Validity service and Conditions DataBase (see the sketch below)
    • Detector response simulation in Athena
    • Pile-up in Athena (was in atlsim/G3)
  Slide from D. Barberis, LHCC, 1 July 2003
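
  An Interval of Validity (IoV) service, as named above, associates each conditions object (calibrations, alignments, etc.) with the run/time range over which it applies. Below is a minimal conceptual sketch in Python of how such a lookup behaves; all class and method names are hypothetical illustrations, not the actual Athena or Conditions DataBase interfaces.

    import bisect

    class IOVStore:
        """Toy interval-of-validity store: maps [since, until) ranges to
        conditions payloads. Conceptual sketch only; names are hypothetical,
        not the real Athena/Conditions DB API."""

        def __init__(self):
            self._starts = []   # sorted 'since' boundaries
            self._entries = []  # (since, until, payload), same order as _starts

        def add(self, since, until, payload):
            # Assumes non-overlapping validity intervals.
            i = bisect.bisect_left(self._starts, since)
            self._starts.insert(i, since)
            self._entries.insert(i, (since, until, payload))

        def lookup(self, timestamp):
            # Find the interval whose [since, until) contains the timestamp.
            i = bisect.bisect_right(self._starts, timestamp) - 1
            if i >= 0:
                since, until, payload = self._entries[i]
                if since <= timestamp < until:
                    return payload
            raise KeyError(f"no conditions valid at t={timestamp}")

    store = IOVStore()
    store.add(1000, 2000, {"pedestal": 42.0})   # valid for runs 1000-1999
    print(store.lookup(1500))                   # -> {'pedestal': 42.0}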

  6. How to get there: 2) Data Challenges
  • DC1 (2002-2003) completed in April 2003:
    • 2nd pass of reconstruction with Trigger L1 and L2 algorithms for HLT TDR in progress
    • Zebra/Geant3 files will be converted to POOL format and used for large-scale persistency tests
    • They will be used as input for validation of the new reconstruction environment
  • DC2 (1st half 2004):
    • Provide data for Computing Model document (end 2004)
    • Full use of Geant4, POOL and Conditions DB
    • Simulation of full ATLAS and of 2004 combined test beam
    • Prompt reconstruction of 2004 combined test beam
  • DC3 (2nd half 2005):
    • Scale up computing infrastructure and complexity
    • Provide data for Physics Readiness Report
  • Commissioning run (from 2nd half 2006):
    • Real operation!
  Slide from D. Barberis, LHCC, 1 July 2003

  7. LCG Applications Components
  • SEAL
    • Plug-in manager (see the sketch below)
      • Internal use by POOL now
      • Full integration into Athena Q3 2003
    • Data Dictionary
      • Integrated into Athena now
      • Includes Python support
  • POOL
    • Integration underway
    • Goal is to have demonstrated support for POOL by 31 July
      • Ability to read and write components of the ATLAS EDM
    • Complete support by Oct 2003
  • SEAL Maths Library
    • Integrate in time for DC-2
  • PI
    • Integrate ROOT implementation of AIDA API Q3 2003
  • SPI: Software Project Infrastructure
  Slide from D. Barberis, LHCC, 1 July 2003
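
  A plug-in manager, as mentioned above, lets the framework instantiate components (e.g., a persistency back end) by name at run time instead of linking them statically. A generic sketch of the pattern in Python follows; it illustrates the idea only, and none of these names reflect SEAL's actual C++ interface.

    class PluginRegistry:
        """Minimal plug-in registry: components self-register under a string
        key and are created by name at run time. Names are hypothetical."""

        _factories = {}

        @classmethod
        def register(cls, name):
            def decorator(factory):
                cls._factories[name] = factory
                return factory
            return decorator

        @classmethod
        def create(cls, name, *args, **kwargs):
            if name not in cls._factories:
                raise KeyError(f"no plug-in registered as '{name}'")
            return cls._factories[name](*args, **kwargs)

    @PluginRegistry.register("RootPersistency")
    class RootPersistencySvc:
        def write(self, obj):
            print(f"writing {obj!r} via the ROOT back end")

    # The framework chooses the back end from configuration at run time:
    svc = PluginRegistry.create("RootPersistency")
    svc.write("event #1")

  The point of the pattern is that adding a new back end requires no change to the framework code, only a new registered component.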

  8. US ATLAS Computing Organization Chart
  [Organization chart omitted]

  9. US ATLAS Computing Management Plan
  • Existing document from Nov. 2001
    • Includes Tier-2 selection process (timescale has slipped)
  • Being rewritten now to take into account the new structure and the Research Program
  • Main change: relative roles of Shank/Huth. In broad brush-strokes:
    • Shank: day-to-day management of the computing plan
      • Budget allocation for project-funded people
      • Work plan for all computing activities
    • Huth: deals with issues broader than just US ATLAS
      • NSF Large ITR: DAWN
      • Grid projects: PPDG, GriPhyN, iVDGL
      • LCG (POB)
      • ICB (ATLAS International Computing Board)
  • This new organization with Shank/Huth is working well.

  10. US ATLAS Computing Planning
  • Complete scrubbing of the WBS from the January review is in progress.
    • Series of WBS scrubbing meetings culminating on 6/6/03
    • Participants: Level 3 managers and above
    • Concentrated on project-funded resources
      • This part is done and is reflected in today's talks.
    • More work needed on base and other funded resources.
  • More work needed on integration with ATLAS planning
    • Working with the new ATLAS planning officer.
    • ATLAS planning will be complete for the Sept. manpower review.

  11. Facilities/GTS/Production MS Project

  12. MS Project Facilities Milestones

  13. Grid3/GTS Milestones

  14. Software MS Project Milestones for ATLAS overall, LCG and U.S. ATLAS

  15. Computing/M&O budget split
  • US Executive Board and US Level 2 managers advise the Project Manager (PM) on the M&O/Computing split
  • The long-standing US Management Contingency Steering Group from the construction project now becomes an advisory body to the PM for the Computing/M&O split
    • Members: P. Jenni, T. Akesson, D. Barberis, H. Gordon, R. Leitner, J. Huth, L. Mapelli, G. Mikenberg, M. Nessi, M. Nordberg, H. Oberlack, J. Shank, J. Siegrist, K. Smith, S. Stapnes, W. Willis
    • Represents all ATLAS interests
    • Meets ~quarterly
    • Unique body that has served ATLAS and US ATLAS well.
  • Decisions based on interleaved priorities, case-by-case.
  • US computing is presently working with ATLAS computing to prepare "planning tables" as used in the construction project.
    • Requires a detailed resource-loaded schedule and Research Program profile

  16. U.S. ATLAS Research Program

  17. FY03 Commitments
  • Existing effort on Athena and data management
    • FY03: 12 FTEs, $2,293k
      • Project management/coordination: 2 FTEs
      • Core services: 3.75 FTEs
        • Program flow, kernel interfaces, user interfaces, calibration infrastructure, EDM
      • Data management: 3.6 FTEs
        • Deploying DB services, persistency service, event store, geometry + primary numbers
        • Collections, catalogs, metadata
      • Application software: 1.4 FTEs
        • Geant3 + reconstruction
      • Infrastructure support: 1.25 FTEs
        • Librarian
  • Existing effort on data challenges, facilities
    • 4.5 FTEs for T1 infrastructure/management: $925k
  • Existing effort on physics support: 1 FTE, $100k
  • UM collaboratory tools: $20k
  • Total FY03 expenditure: $3,338k
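
  A quick cross-check of the totals quoted on this slide (all figures taken directly from it):

    # FY03 commitments from the slide, in FTEs and $k.
    sw_ftes = {
        "Project management/coordination": 2.0,
        "Core services": 3.75,
        "Data management": 3.6,
        "Application software": 1.4,
        "Infrastructure support": 1.25,
    }
    assert abs(sum(sw_ftes.values()) - 12.0) < 1e-9   # matches the 12 FTEs quoted

    costs_k = {
        "Athena + data management (12 FTEs)": 2293,
        "T1 infrastructure/management (4.5 FTEs)": 925,
        "Physics support (1 FTE)": 100,
        "UM collaboratory tools": 20,
    }
    print(sum(costs_k.values()))   # -> 3338, i.e. the $3,338k total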

  18. Proposed FY04 increment
  • Athena + data management
    • Ramps from 12 to 16.5 FTEs
    • 4.5-FTE priorities / work plan covered in the SW talk
  • Facilities/DC production
    • T1 (priorities discussed in the facilities talk):
      • $390k for capital equipment
      • Ramp from 4.5 to 6.5 FTEs for T1
    • Ramp DC production FTEs from 0.9 to 2.5
      • 1.5 FTEs at the T1 center
      • 1.0 at a university
  • This would ramp the overall budget from $3.338M in FY03 to approximately $5.2M in FY04.
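
  As a rough plausibility check of the ~$5.2M figure, the stated increments can be costed at the average per-FTE rates implied by the FY03 slide (about $191k per SW FTE and $206k per T1 FTE). The production rate below is an assumption, not a number from the slides; this is back-of-the-envelope only.

    # Back-of-the-envelope FY04 estimate from the stated increments ($k).
    fy03_total_k = 3338
    sw_rate_k = 2293 / 12.0     # ~191 $k/FTE, implied by the FY03 slide
    t1_rate_k = 925 / 4.5       # ~206 $k/FTE, implied by the FY03 slide
    prod_rate_k = sw_rate_k     # assumed comparable to the SW rate

    fy04_k = (fy03_total_k
              + 4.5 * sw_rate_k     # Athena + data management: 12 -> 16.5 FTEs
              + 2.0 * t1_rate_k     # T1: 4.5 -> 6.5 FTEs
              + 1.6 * prod_rate_k   # DC production: 0.9 -> 2.5 FTEs
              + 390)                # T1 capital equipment
    print(f"~${fy04_k / 1000:.1f}M")   # -> ~$5.3M, close to the ~$5.2M quoted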

  19. FY04 budget studies
  • Models 1-6 run from very bare bones to what we think is the appropriate level for US ATLAS
  • Current projections put us at Model 4
  • Details of the SW FTE increment covered in the SW talk by S. Rajagopalan

  20. Effect on SW FTEs in FY04 budget scenarios
  [Chart comparing budget Models 1-3 (whose increments are in production) through Model 6 omitted; details of these priorities will be in the SW talk]
  • 1.0 FTE in Graphics
  • 0.5 FTE in Analysis Tools
  • 1.0 FTE in Data Management
  • 1.0 FTE in Detector Description
  • 1.0 FTE in Common Data Management Software
  • 0.5 FTE in Event Store

  21. If forced into a $4.7M FY04 budget
  • SW cuts:
    • Graphics (1.0 FTE)
    • Data Management (1.0 FTE):
      • Support for non-event data (0.5 FTE)
      • Support for basic database services (0.5 FTE)
    • Analysis Tools (0.5 FTE)
    • Detector Description (1.0 FTE)
  • Other cuts in DB/Athena jeopardize our ability to test the computing model in the DC.
  • Other cuts in production capability don't allow us to run the DC.
  • Delay new hires 1-3 months into the year to balance the budget.

  22. The University Problem
  • US ATLAS has 3 national labs
    • Lots of expertise, which we are using effectively
  • With budget pressures, little project funding is left for university groups, both small and large.
  • On day 1, when we will extract physics from ATLAS, we NEED university groups fully involved (students, postdocs)
  • Solution???
    • Call on the Management Reserve
      • We are making a list
      • Will include some support for universities already working on the testbed
      • A little goes a long way!
    • Increase in base funding?

  23. FY05 and beyond
  • Major management task for the next few months
    • Assign priorities, establish profile.
  • Guidance ramp-up to $7,155k helps
  • But many things are ramping up in FY05:
    • Tier 1
    • Tier 2's!
    • Software
      • Ramp things we can't afford in FY04
      • Further ramps in things like analysis tools
    • Production
      • More DCs → more FTEs for production
  • Makes FY05 look like a tough year also.
  • Guidance for FY06-07 looks better

  24. Jan 2003 LBL review questions (1)
  • Facilities
    • Given the funding shortfall, how will grid and networking support be covered?
      • Relying on the US testbed facility (university/other labs) for some support
      • +1 FTE new hire @ Tier 1 in FY04
    • Network bandwidth support
      • Not critical for FY04: all will have OC12-192
      • RHIC is only using 15-20% of the OC12 bandwidth now.
    • Coherent plan for grids (testing...)
      • Not clear on the confusing relationships that exist between projects within ATLAS and external grid projects
      • Grid3 task force: Rob Gardner, Rich Baker.
        • Aligns all our efforts leading up to DC2: Aug. BNL tutorials, SC2003 demo, pre-DC2 tests, DC2.

  25. Jan 2003 LBL review questions (2)
  • SW
    • US ATLAS must take care to properly plan, across its operation, the impact on US-specific efforts of taking on new central tasks.
      • We see the new international ATLAS management helping us here
        • Find extra, non-US help
      • Our scrubbed WBS is our guide!
    • Athena-grid integration should be a priority
      • Is seen as important in the overall ATLAS org chart
      • NSF DAWN funding should solve this.
    • Concerned about the necessary reliance of ATLAS on the newly formed LCG
      • ATLAS is fully committed to LCG software. We see "value added" in the LCG applications area.
      • Grids: some worries. We have viable US grid projects delivering middleware
        • Will emphasize interoperability.

  26. Jan 2003 LBL review questions (3)
  • Many milestones
    • No baseline
    • How do trade-offs impact the overall schedule?
  • We have redefined the WBS and milestones
    • Many fewer milestones, aligned with our quarterly reporting to facilitate tracking

  27. Project management comments from the Jan. 2003 review (1)
  • The scope should be more formally defined with software agreements.
    • SW agreements have been "on hold" in ATLAS for over 1 year.
    • It is working for Athena; the ADL experience has made most wary.
    • Probably makes sense for our DB effort. Will pursue...
  • US ATLAS should continue to be wary of taking on additional projects from International ATLAS
    • Working with ATLAS (DQ/DB + planning officer) to make sure tasks are covered by international effort
    • Our US WBS scrubbing sharply defines our tasks/deliverables.
  • The project managers could benefit from increased use of traditional project management tools
    • MS Project: demonstrated here today. The BNL project management office is helping us.
    • Weekly phone meetings with all managers down to Level 2
      • Keeps all informed, on the same page, engaged.

  28. Project management comments from the Jan. review (2)
  • It is important to have some personnel at CERN to assist the US ATLAS members
    • This has always been our priority, BUT not as high as maintaining the SW development team.
    • LBL has always had ~1 at CERN; currently 2 (D. Quarrie, M. Marino)
    • BNL has P. Nevski, T. Wenaus.
    • UM has 1 (base): S. Goldfarb, Muon SW coordinator.

  29. Conclusions
  • New management in place
    • Working well!
  • New WBS
    • Project-funded parts scrubbed.
    • Scope and near-term deliverables well-defined
    • Working on long-term and overall ATLAS planning
    • Working on non-project-funded parts
  • Budget pressure still hurts
    • SW scope smaller than we think appropriate
    • Facilities ramping slowly
    • University support lacking
