NASA's Focus on Improving Performance: Presentation to the NDIA, May 21, 2009
    Presentation Transcript
    1. NASA's Focus on Improving Performance. Presentation to the NDIA, May 21, 2009. Sandra Smalley, Acting Director, Engineering and Program/Project Management Policy, Office of the Chief Engineer

    2. Big Picture
    • NASA's mission is to pioneer the future in space exploration, scientific discovery, and aeronautics research.
    • NASA employs a strategic management approach requiring all organizations to manage requirements, schedule, and budget according to a program/project management method built on NASA's Governance Model and NASA's Core Values and Guiding Principles.

    3. Multi-Tiered Approach to Cost, Schedule, and Performance Assessment
    • Enhanced policy
      • New Agency policy for acquisition
        • Strengthened strategic integration
        • Application of Joint Confidence Levels
      • Existing policy for program/project management
        • Provides framework and requirements for program and project management
        • Established standard WBS and common project life cycle for space flight projects
        • Currently being revised to flow down new acquisition policy
    • Enhanced review process
      • Engages all levels (Project, Program, Mission Directorate, Agency)
      • Key Decision Points, reviewed independently
      • Periodic reviews (monthly, quarterly, annually)

    4. Enhanced Policy
    • Implemented new Agency policy for acquisition
    • Maintain and enhance existing program/project management policy

    5. New Agency Policy for Acquisition
    Defines acquisition as the process for obtaining the systems, research, services, construction, and supplies that the Agency needs to fulfill its mission. Acquisition, which may include procurement (contracting for products and services), begins with an idea or proposal that aligns with the NASA Strategic Plan and fulfills an identified need, and ends with the completion of the program or project or the final disposition of the product or service. (2-18-09)

    6. Highlights of the New Acquisition Policy
    Goal: document the Agency's best practices and improve the Agency's performance against external commitments for space flight and information technology programs/projects.
    Strengthened the acquisition process:
    • Added early involvement by senior management
      • Acquisition Strategy Planning (ASP) meeting
        • Ensures consistency with the Vision, Strategic Plan, and Agency budget request
        • Assigns program/project to a Center
        • Directs major partnerships
      • Acquisition Strategy Meeting (ASM), early in Formulation
        • Reviews make-or-buy decisions
        • Approves acquisition strategy
    • Retained Procurement Strategy Meetings prior to release of major RFPs/contracts
    • Complies with all FAR and NASA FAR requirements

    7. Highlights of the New Acquisition Policy, Continued
    Improve estimating and performance:
    • Programs and projects in Implementation are to be baselined or rebaselined and budgeted based on a joint cost and schedule probabilistic analysis.
    • Programs are to have a confidence level of 70%, or the level approved by the decision authority of the responsible Agency-level management council.
    • At a minimum, projects are to be funded at a level equivalent to a confidence level of 50%, or as approved by the decision authority of the responsible management council.
    • Requires consistency among the approved cost estimates, funding strategy, the NASA budget, and external commitments.

    8. Joint Confidence Level Overview
    Policy overview:
    • All space flight and information technology programs shall develop a joint cost and schedule probabilistic analysis and be baselined or rebaselined and budgeted such that there is a 70 percent probability of achieving the stated life cycle cost and launch schedule.
    • Applicable decision authorities may approve a different joint confidence level.
    • Projects are to be baselined or rebaselined and budgeted at a confidence level consistent with the program's confidence level.
    • At a minimum, projects are to be funded at a level equivalent to a confidence level of 50 percent, or as approved by the applicable decision authority.
    • Programs' or projects' proposed cost and schedule baselines are to be assessed by an independent review team.
    • Mission Directorates or Mission Support Offices are to confirm that programs' and projects' life cycle cost estimates and annual budget submissions are consistent.
    JCL defined:
    • A JCL is:
      • A Joint (cost and schedule) Confidence Level: the probability that cost will be equal to or less than the targeted cost AND schedule will be equal to or less than the targeted schedule date
      • A process and product that helps inform management of the likelihood of a project's programmatic success
      • A process that combines a project's cost, schedule, and risk into a complete picture
    • A JCL is not:
      • A specific methodology (e.g., a resource-loaded schedule)
      • A product from a specific tool (e.g., @RISK)
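    The JCL definition above can be illustrated with a small Monte Carlo sketch. This is a hypothetical example, not a NASA tool or methodology: the lognormal distributions, medians ($100M, 48 months), spreads, and the shared-factor correlation are all illustrative assumptions, chosen only to show how a joint (cost AND schedule) probability differs from two separate single-variable confidence levels.

    ```python
    import math
    import random

    def joint_confidence_level(cost_baseline, sched_baseline,
                               n=100_000, rho=0.6, seed=1):
        """Estimate a JCL: the probability that cost <= cost_baseline
        AND schedule <= sched_baseline.

        Illustrative only: cost and schedule are lognormal, with
        positive dependence induced by a shared normal factor
        (the resulting correlation of the normal factors is rho**2).
        """
        random.seed(seed)
        hits = 0
        for _ in range(n):
            z_common = random.gauss(0, 1)
            z_cost = rho * z_common + math.sqrt(1 - rho**2) * random.gauss(0, 1)
            z_sched = rho * z_common + math.sqrt(1 - rho**2) * random.gauss(0, 1)
            cost = 100 * math.exp(0.25 * z_cost)    # hypothetical: median $100M
            sched = 48 * math.exp(0.15 * z_sched)   # hypothetical: median 48 months
            if cost <= cost_baseline and sched <= sched_baseline:
                hits += 1
        return hits / n

    # Baselining at the medians (50% confidence on each axis separately)
    # yields a joint confidence well below 50%, which is why budgeting
    # above the median estimate is needed to reach a 70% JCL.
    print(round(joint_confidence_level(100, 48), 2))
    print(round(joint_confidence_level(125, 56), 2))
    ```

    The point of the sketch is the AND in the definition: because both conditions must hold at once, a baseline that gives 50% confidence on cost and 50% on schedule independently gives far less than 50% jointly, so the baseline must be raised until the joint probability reaches the required 70%.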

    9. Existing Policy for Program/Project Management
    Three program and project management policy documents govern the following portfolios of projects:
    • Space Flight
    • Information Technology and Institutional Infrastructure
    • Research and Technology
    Highlights include:
    • Standardized project life cycle for space flight projects
    • Established key decision points
    • Established review structure (dependent and independent)

    10. Standard Program and Project Life Cycle for Space Flight Projects
    • Provides a uniform life cycle for human and robotic missions
    • Common process flow, uniform phases, and KDPs
    • Disciplined review structure for technical requirements and implementation plans
    • Five key elements in execution of the life cycle:
      • Key Decision Points
      • Decision Authority role as gatekeeper
      • Required independent reviews
      • Required life cycle gate products
      • Center Management Councils' and governing Program Management Councils' role in the life cycle process

    11. Project Life Cycle, Simplified
    Formulation spans Phases Pre-A through B; Implementation spans Phases C through F. Key Decision Points (KDPs A–F) gate entry into each phase.
    • Pre-A: Concept Studies
    • A: Concept & Technology Development
    • B: Preliminary Design & Technology Completion
    • C: Final Design & Fabrication
    • D: System Assembly, Test, & Launch
    • E: Operations & Sustainment
    • F: Closeout
    Major reviews: Mission Concept Review, Systems Requirements Review, Mission Definition Review, Preliminary Design Review, Critical Design Review, Systems Integration Review, Operational Readiness Review, Flight Readiness Review, Post-Launch Assessment Review, Decommissioning Review.

    12. Project Assessments (Independent Life Cycle Reviews and Periodic Reviews)
    Across the same life cycle (Phases Pre-A through F, gated by KDPs A–F), two kinds of assessment apply:
    • Independent life cycle reviews (including independent Standing Review Board assessment)
    • Periodic performance assessments (assessed monthly/quarterly/annually at all levels: Project, Program, Center, Mission Directorate, Agency)

    13. Independent Life Cycle Review Overview
    The review of programs and projects at each life cycle milestone by competent individuals who are not dependent on or affiliated with the program/project, to objectively assess:
    • The adequacy and credibility of the technical approach (including but not limited to requirements, architecture, and design), schedule, resources, cost, risk, and management approach;
    • Progress against the Program/Project Plan;
    • Readiness to proceed to the next phase; and
    • Compliance with NPR 7120.5 and NPR 7123.1 requirements.

    14. Independent Life Cycle Assessments Process
    • The Decision Authority provides terms of reference, and a Standing Review Board (SRB) is established and convened.
    • The program/project provides internal reviews, baselines, and summary packages to the SRB.
    • The SRB reports its assessment to the program/project, the Center Management Council (CMC), the Program Management Councils (PMC), and other relevant individuals (e.g., Technical Authorities, Program Analysis & Evaluation).
    • Findings are dispositioned, and the CMC and PMC provide recommendations.
    • The Decision Authority decides on readiness for the next phase.

    15. Why Have a Life Cycle Independent Review Process?
    To provide:
    • The program/project with a credible, objective assessment of how they are doing.
    • NASA senior management with an understanding of whether the program/project is on the right track, is performing according to plan, and whether externally imposed impediments to the program/project's success are being removed.
    • A credible basis for a decision to proceed into the next phase.
    The independent review also provides additional assurance to external stakeholders that NASA's basis for proceeding is sound.

    16. Periodic Performance Reviews
    • Agency level: Baseline Performance Review (BPR), monthly/quarterly, covering Mission Directorate, program, project, and institutional performance (assesses cost, schedule, technical, risk, cross-cutting technical, and cross-cutting institutional issues, etc.)
    • Center level: Mission Directorate (MD) and Center monthly/quarterly performance reviews of programs and projects (assess cost, schedule, technical, risk, and impact to portfolio/program/center, etc.)
    • Project level: monthly project reviews (assess cost, schedule, technical, risk, etc.), including contractor performance

    17. "Top-Level" BPR Process
    • Inputs (template questions, data, and reviews) come from multiple perspectives: programs and projects; Mission Directorates and Centers; and Mission Support, including independent assessment of programs, projects, Mission Directorates, and functional areas.
    • The BPR meeting covers cross-cutting technical and non-technical issues and special topics, with an AA pre-brief and post-brief and a Mission Directorate and project response.
    • Actions and feedback follow; the cycle begins on the 1st of the month and runs approximately three weeks.

    18. Mission Directorate Overview, May 2009 (EXAMPLE)
    Status:
    • 5 projects rated Red overall (project names would be provided here)
    • 1 project rated Yellow overall (project name would be provided here)
    • Project A successfully underway; Project B successfully launched
    Technical:
    • Program A: Project Y Observatory cryo-thermal performance issues.
    • Program B: Inability to manufacture replacement boards for Project X's spacecraft Single Board Computer. Project says: "Considerable residual concern remains."
    • Program C: Project Z actuators.
    • All: Continued difficulties developing instruments.
    Schedule:
    • Program A: Project A milestones slipping; Project Z slack margin and instrument beta and gamma deliveries.
    • Program C: Project D Instrument Alpha has consumed all schedule margin; Project X launch delay and potential for more from SBC issues.
    Cost:
    • $47M in Project X APA reserves, needed for 70% confidence level, was moved from FY'10 & '11 to after …
    • Program A: Funding of Project Y slip limiting division's …
    • Program B: Project A and Project B FY'09 reserves low.
    • Program C: Overall division budget issues.
    • All: Access to space continues to be an issue: Rocket B replacement; Rocket A failure and its impact on Project X; very crowded launch manifest in fall 2011; conflicts for launch processing facilities.
    Watch list items:
    • Project A: difficult cost vs. risk tradeoff choices.
    • Project C: cost impact on the project if ULA launch delay is realized.

    19. EXAMPLE: Mission Directorate Programs and Projects, Assessments Roll-up Sample
    A roll-up chart of monthly assessments (Feb–Apr) for Programs/Projects A through E, spanning Formulation and Implementation.

    20. Project Status: Project X (EXAMPLE)
    Status: In Formulation
    Technical:
    • Working Thrust Oscillation concerns: active controller dropped; passive system baselined, but Project is considering an additional isolator (between the upper stage and service module, which could impact the stack height) (Y)
    • First Stage has implemented 100% tangential X-ray NDE on castings
    • Project X tower recontact/plume damage (Y)
    Schedule:
    • AB-1 motor test schedule moving toward August 2009 due to FSM 17 review
    • Delta PDR scheduled for Thrust Oscillation in Fall 2009
    Programmatic/Cost:
    • A number of contract actions are being worked to align elements with program guidance; planned for execution beginning in the March 2009 timeframe
    • Partner rayon has become available, and NASA has an opportunity to procure the material
    • Unclear how the Integrated Planning Mtg 09 results will affect cost confidence level (Y)
    • Additional $16M was provided by HQ to resolve the CR issue; improved cost from Red to Yellow

    21. BPR Benefits/Results
    • Focused NASA senior leaders as an "Execution Board of Directors"
    • Quick response to emerging issues
    • Course corrections in implementation of PMC and SMC decisions
    • Identify cross-cutting issues and action plans (e.g., software complexity, instrument acquisition)
    • Resolution of organizational communication issues (field Centers, programs, Mission Directorates, etc.)
    • Identify mission support issues (e.g., procurement workforce)

    22. Conclusion
    NASA recognizes the following as key elements for continued performance enhancement:
    • Clear policy and standards
    • Engaged leadership and team members at all levels
    • Review processes for periodic status checks
    • A variety of tools to assess performance and address issues
    • The knowledge and experience to know when to apply the appropriate tool and how to interpret the results