
Evaluate SE Methods, Processes and Tools Technical Task Plan


Presentation Transcript


  1. Evaluate SE Methods, Processes and Tools Technical Task Plan. USC Workshop, Los Angeles, CA, 29 January 2009

  2. Agenda • Overview and changes in the MPT task • Near term activities • Sponsor environment • Revised schedule • Workshop activities

  3. SOW Language: Look at current SE methods, processes, and tools (MPTs) as they are applied across the DoD acquisition life cycle, focusing on three different development environments: individual weapons systems, SoS, and network-centric systems. Research will be targeted at improving current/identifying new SE MPTs that will better support the practice of SE in these three environments. Specifically, this task will:
     1. Define critical attributes of current SE MPTs across the weapons system, SoS, and network-centric services environments;
     2. Identify strengths and weaknesses for these current MPTs and any shortcomings in their application across DoD;
     3. Recommend, in priority order, MPTs for further study to innovate or create improved or new MPTs to eliminate identified shortcomings;
     4. Upon selection by the government of MPTs recommended in sub-task 3 for further study, perform research to innovate or create improved or new MPTs to eliminate identified shortcomings, thereby advancing the state of practice of SE within the community; and
     5. For the improvements delivered in sub-task 4 above, propose a methodology for validating the programs.

  4. MPT Task Overview (process flow from the slide graphic; a sketch of the priority queue follows below)
     • 3.3.1 Establish criteria and validate with sponsor
     • 3.3.2 Select MPTs for evaluation: apply the selection criteria to raw MPTs identified from the MPT sources (DoD guidebooks, DoD programs/reviews, service repositories, defense industry, commercial industry), producing a queue of selected and prioritized MPTs (H/M/L)
     • 3.3.3 Describe MPTs: complete detailed attributes (eWorkshop), producing fully described MPTs to evaluate
     • 3.3.4 Evaluate MPTs: apply the evaluation criteria (BPCh)
     • 3.3.5 Deliver evaluated MPTs to users
     • 3.3.6 Produce MPT analysis reports and recommendations (recommended MPTs, improvements needed, overall gap analysis, research areas); track cumulative MPT coverage and repeat
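A minimal sketch of the selection queue above, assuming plain Python data shapes (the example MPT names and the heapq choice are mine, not the task's):

```python
import heapq

# Map the slide's H/M/L priorities so that heapq (a min-heap)
# pops High before Medium before Low.
RANK = {"H": 0, "M": 1, "L": 2}

def push(queue, priority, name, source):
    """Add a selected MPT to the prioritized queue (3.3.2)."""
    heapq.heappush(queue, (RANK[priority], name, source))

queue = []
push(queue, "M", "DoDAF viewpoint reviews", "DoD Guidebooks")      # illustrative
push(queue, "H", "SoS technical reviews", "DoD Programs/Reviews")  # illustrative
push(queue, "L", "Commercial tool survey", "Commercial industry")  # illustrative

while queue:
    rank, name, source = heapq.heappop(queue)
    # 3.3.3-3.3.5: describe the MPT, apply evaluation criteria, deliver to users
    print(name, "from", source)
```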

  5. Systems Engineering and Level-of-Effort Contracts. Dennis Barnabe, SERC PM, 21 Nov 2008, SERC Kickoff Meeting

  6. Slippery Slope Logic • Mission-card • Agility • Prototype/Discovery • LOE Contract

  7. Relationships to Agility (matrix from the slide; the axis runs from high agility/speed on the left to low on the right)
     • Contract Type: LOE/T&M/TTO → Delivery/Turnkey
     • Project Type: Prototype → QRC + O&M → LRIP + O&M → Prod + O&M
     • Requirements Detail: L → H
     • Mission Satisfaction: Local → Broad
     • Maintainability: L → H
     • Redundancy Risk: H → L
     • Scalability: L → H
     • Complexity/Size: L → H
     • Integration/Interoperability: L → H

  8. SE “Equalizer”: Prototype or Discovery (slider graphic showing settings for Requirements, Configuration Management, Life Cycle Planning, Technical Reviews, Technical Documentation, and Testing)

  9. SE “Equalizer”: QRC (same six sliders, tuned for QRC)

  10. SE “Equalizer”: Development with an eye toward sustained Ops (same six sliders; a settings-lookup sketch follows below)
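One way to read these three “Equalizer” slides is as a lookup from project phase to how far each SE slider is pushed. A hypothetical sketch; the deck shows the settings only graphically, so the 0-10 levels below are invented placeholders:

```python
# Illustrative only: the numeric levels are placeholders, not values from the deck.
SE_DIMENSIONS = [
    "Requirements", "Configuration Management", "Life Cycle Planning",
    "Technical Reviews", "Technical Documentation", "Testing",
]

EQUALIZER = {
    "Prototype/Discovery": dict(zip(SE_DIMENSIONS, [2, 3, 2, 1, 2, 3])),
    "QRC":                 dict(zip(SE_DIMENSIONS, [5, 5, 4, 4, 4, 6])),
    "Sustained Ops":       dict(zip(SE_DIMENSIONS, [8, 9, 8, 8, 9, 9])),
}

def slider(phase: str, dimension: str) -> int:
    """Return the 0-10 'equalizer' setting for one SE dimension in a phase."""
    return EQUALIZER[phase][dimension]

print(slider("QRC", "Testing"))  # -> 6: more rigor than a prototype, less than sustained ops
```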

  11. (Usual) LOE SE Implications
     • Requirements lacking
     • Limited (if any) ‘formal’ reviews
     • No coordination/insight among related efforts: interface and duplication risks
     • No ability to assess technical health: standards application, etc.
     • No formal ‘transition’ planning: what if it works?
     • Build to Cost: no actual cost estimate of satisfying the mission need; if successful, Operations cuts into Development; deemed a ‘tech transfer issue’
     • Schedule lacking: inability to coordinate with other efforts
     • “Success” defaults to ‘what is delivered’

  12. MPT Task Overview (the slide 4 process flow repeated, annotated “Changes Required” at the MPT identification and selection steps)

  13. Changes to Identification Process (1) • Guidance: a focus on the IC environment (context) changes the strategy to one that initially leverages the BPCh Content Provider Network (CPN), and requires a different candidate MPT collection strategy based on IC context and requirements • New strategy: extend the context attributes of the current MPT description to support definition of the IC environment; define and validate the IC environment and requirements using a revised MPT description template with extended context attributes; compare to other environments (contexts) and, where similarities are found, mine the environment for MPTs

  14. Changes to Identification Process (2) • Tactics: • Develop an initial set of context attributes and values that characterize the NSA environment based on current understanding • Revise the current MPT template to include the extended attribute list, MPT requirements, an information summary, a selection recommendation, and supporting rationale (a template sketch follows below) • Validate attributes and practice criteria in requirements interviews with NSA personnel (critical) • Use the template for a 3-pronged MPT identification effort: (1) review sources provided in the SOW and the BPCh CPN; (2) review open literature and web-based sources; (3) capture current applicable NSA MPTs. Prongs 1 and 2 can begin when the initial template is available; prong 3 depends on sponsor participation and the ability to coordinate schedules/access • Adapt the selection criteria and process to the new template
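The revised template could be captured as a structured record. A minimal sketch assuming field names inferred from the bullet list above (the class and example values are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class MPTTemplate:
    """One-page candidate MPT description per the revised template (fields inferred)."""
    name: str
    context_attributes: dict = field(default_factory=dict)  # extended attribute list
    requirements: list = field(default_factory=list)        # MPT requirements
    information_summary: str = ""
    selection_recommendation: str = ""                      # e.g. "select" / "defer"
    rationale: str = ""                                     # support for the recommendation

# Hypothetical candidate captured during the identification effort
candidate = MPTTemplate(
    name="Capability mash-up review",
    context_attributes={"criticality": "QRC-high", "security": "SCI"},
    requirements=["supports short development cycles"],
    information_summary="Identified via the BPCh Content Provider Network.",
    selection_recommendation="select",
    rationale="Matches validated sponsor environment attributes.",
)
```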

  15. MPT Task Changes (revised flow from the slide graphic)
     • Establish preliminary sponsor environment and needs
     • Develop extended MPT template (extended environmental attributes)
     • Validate environment and needs with sponsor (conduct interviews)
     • Identify and mine comparative environments for MPTs (review sources provided in the SOW, review literature and web, access experts through the team), producing initial MPT candidate templates
     • Refine templates and select MPTs for evaluation, producing the queue of selected and prioritized MPTs (H/M/L)

  16. MPT Identification/Selection Activities (1) • Establish preliminary sponsor environment and needs • Develop the initial MPT evaluation/characterization template • Revise and extend the proposed attribute set • Extend the context attributes • One-page template for candidate identification • Validate environment and needs • Interview sponsor personnel • Describe the type of people to interview • Develop an interview structure based on the template • Revise preliminary attributes, values and needs, and capture new ideas • Revise/extend the template as needed • Revise the evaluation criteria based on the needs assessment (Italics on the original slide indicate tasks of the workshop.)

  17. MPT Identification/Selection Activities (2) • Gather MPT candidates from the broader community based on the environmental description • Identify the best approach for this • First target is the INCOSE Workshop next week • Identify comparable environments • Through literature, web and expert inputs, identify development/acquisition/deployment environments that share attribute values with the validated sponsor environment (a matching sketch follows below) • Mine comparable environments for candidate MPTs • Review SOW-specified sources • Review comparable environments as they are identified for candidate MPTs
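Identifying comparable environments amounts to matching attribute values. A minimal sketch, assuming environments are flat attribute/value dicts and using a simple agreement ratio (the metric, threshold, and example values are my choices, not the task's):

```python
def similarity(env_a: dict, env_b: dict) -> float:
    """Fraction of shared attributes on which two environments agree."""
    shared = set(env_a) & set(env_b)
    if not shared:
        return 0.0
    return sum(env_a[k] == env_b[k] for k in shared) / len(shared)

# Hypothetical attribute values for illustration only
sponsor = {"cycle": "short", "quality_at_deploy": "functional", "interdependency": "very high"}
candidate_env = {"cycle": "short", "quality_at_deploy": "functional", "interdependency": "medium"}

if similarity(sponsor, candidate_env) >= 0.5:  # threshold is arbitrary
    print("Mine this environment for candidate MPTs")
```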

  18. MPT Identification/Selection Activities (3) • Select MPTs for evaluation • Review candidate templates • Refine and extend template descriptions for promising candidates • Select evaluation candidates

  19. Initial environment description • The NSA environment can be described as • A short development cycle to meet quick-response needs, with lowered quality requirements at initial deployment • An evolutionary deployment strategy that may begin with limited deployment at relatively low quality and evolve into broader deployment at higher quality • A high level of interdependency with existing products • “Mashing” and expanding results from other projects to create new results • Providing new results for further processing by others • Modifying existing capabilities to meet rapidly changing constraints and/or the availability of different data • A high level of glueware

  20. Original MPT Attributes: What changes are needed? (table of the current MPT attribute set shown on the slide)

  21. Proposed Revised Schedule

  22. MPT Activities during workshop • First Session • Clean up environment description (Rich, Ken) • Less geek language – more general description • Is there a taxonomy to help with completeness? • Hopefully discuss with customer • Determine and define attribute changes (including values) (Forrest) • Develop the MPT mining template (Paul?) • Second Session • Brainstorm MPT mining activities (Paul) • Opportunities, “helper” groups (INCOSE, Redstone SE group, etc.) • Methodology • Build necessary instruments for INCOSE (Forrest) • Possible extra session after the SERC reception tonight? • Possibly at Radisson or near airport (depending on majority)

  23. Backup

  24. Systems Engineering and Level-of-Effort Contracts. Dennis Barnabe, SERC PM, 21 Nov 2008, SERC Kickoff Meeting

  25. Slippery Slope Logic • Mission-card • Agility • Prototype/Discovery • LOE Contract

  26. Relationships to Agility (matrix from the slide; the axis runs from high agility/speed on the left to low on the right)
     • Contract Type: LOE/T&M/TTO → Delivery/Turnkey
     • Project Type: Prototype → QRC + O&M → LRIP + O&M → Prod + O&M
     • Requirements Detail: L → H
     • Mission Satisfaction: Local → Broad
     • Maintainability: L → H
     • Redundancy Risk: H → L
     • Scalability: L → H
     • Complexity/Size: L → H
     • Integration/Interoperability: L → H

  27. (Usual) LOE SE Implications
     • Requirements lacking
     • Limited (if any) ‘formal’ reviews
     • No coordination/insight among related efforts: interface and duplication risks
     • No ability to assess technical health: standards application, etc.
     • No formal ‘transition’ planning: what if it works?
     • Build to Cost: no actual cost estimate of satisfying the mission need; if successful, Operations cuts into Development; deemed a ‘tech transfer issue’
     • Schedule lacking: inability to coordinate with other efforts
     • “Success” defaults to ‘what is delivered’

  28. Tailoring vice Avoidance (iterative acquisition diagram: operational/functional requirements drive technical requirements development and as-built capabilities/performance in repeating 90-day cycles, with “Deploy?” decision points moving a capability through Discovery, QRC, and Limited O&M deployments toward Full O&M on the operational baseline)

  29. Right Tool for the Job • LOE has its niche • SE (& Acquisition) approach must evolve as Objective changes • Prototype/Discovery • QRC • Limited Ops • Full Ops • Production

  30. SE “Equalizer”: Prototype or Discovery (slider graphic showing settings for Requirements, Configuration Management, Life Cycle Planning, Technical Reviews, Technical Documentation, and Testing)

  31. SE “Equalizer”: QRC (same six sliders, tuned for QRC)

  32. SE “Equalizer”: Development with an eye toward sustained Ops (same six sliders)

  33. Possible LOE SE Leverage Points • Ensure standard inclusions • On contract • Adherence • ‘Formal’ gates for phase transitions: Prototype/PofC → QRC → Limited Ops → Sustained Ops • Evolve SE processes appropriately for the given phase • TTOs must be written to support this

  34. Possible new contextual attributes • Brainstormed attribute list with values where available – to be refined! (an encoding sketch follows below) • Criticality for meeting requirements (QRC-high, QRC-low, high, medium, low) • Volatility/evolution of requirements (high: >1%/month; normal: 0.01-1%/month; low: <0.01%/month) • Level of quality required at deployment (functional, reliable, critical) • Level of security required at deployment (SCI, Classified, Unclassified) • Dependence on other systems for critical data and functionality (very high, high, medium, low, none) • Need to coordinate among other efforts • Assessability of technical health (is the health of data sources required?) • Length/stability of life cycle • Stability of the life cycle definition (phases) • Evolution/stability of required ceremony in response to system life cycle needs – how do I prepare enough ceremony up front to be able to make adjustments easily when system maturity/deployment change? Nondeterministic? • Breadth of applicability • Uniqueness of application (are 3 people already doing this and you don’t know it?) • Scalability – in function and in number of copies deployed • Level of transition planning required
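The attribute scales above could be encoded directly for later template validation. A hypothetical sketch; the value scales are copied from the slide, while the dict shape and helper function are assumptions:

```python
# Value scales from the slide; the data structure itself is an assumption.
CONTEXT_ATTRIBUTES = {
    "criticality": ["QRC-high", "QRC-low", "high", "medium", "low"],
    "requirements_volatility": ["high (>1%/month)", "normal (0.01-1%/month)", "low (<0.01%/month)"],
    "quality_at_deployment": ["functional", "reliable", "critical"],
    "security_at_deployment": ["SCI", "Classified", "Unclassified"],
    "dependence_on_other_systems": ["very high", "high", "medium", "low", "none"],
}

def is_valid(attribute: str, value: str) -> bool:
    """Check a proposed environment value against its allowed scale."""
    return value in CONTEXT_ATTRIBUTES.get(attribute, [])

assert is_valid("quality_at_deployment", "reliable")
assert not is_valid("criticality", "urgent")  # not on the brainstormed scale
```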
