
Developmental Test and Evaluation Committee Status and Plans February 2014






Presentation Transcript


  1. Developmental Test and Evaluation Committee: Status and Plans, February 2014. Beth Wilson (Raytheon) and Steve Scukanec (Northrop Grumman), Industry Co-Chairs

  2. DT&E Committee 2014 Status and Plans

  3. Cyber Testing Guidelines. Project Lead: Dave Desjardins. Joint project with ITEA.

  4. Chief Developmental Tester
  The Secretary of Defense shall require that each major defense acquisition program be supported by "(A) a chief developmental tester; and (B) a governmental test agency, serving as lead developmental test and evaluation organization for the program."
  • Coordinating DT&E activities
  • Insight into contractor activities
  • Overseeing T&E activities
  • Informing the government PM about contractor DT&E results
  Project Lead: Joe Manas

  5. Reliability Testing
  • Reliability testing focus:
    • Software reliability standard
    • Cyber testing initiatives
    • Data for reliability growth measures
  • Input/feedback opportunities:
    • IEEE meetings in January and August
    • Periodic revision periods
  Project Leads: Lou Gullo, Taz Daughtrey

  6. DT&E Committee 2014 Task Plan, Projects Working Group
  • Proposed 2014 tasks:
    • Cyber Testing project
    • Chief Developmental Tester project
    • Collaboration with IEEE on reliability
    • Joint meetings with SED committees: Modeling and Simulation; Architecture; Systems Security Engineering; Education and Training
  • Deliverables/products:
    • Cyber Testing recommendations white paper
    • Chief Developmental Tester white paper
    • Candidate KSABs for the role (E&T)
    • Industry interaction recommendation
    • Tutorial on the role for the SE conference
    • Inputs to the reliability standards effort
    • Joint meeting notes
  • Schedule/resources:
    • Cyber Testing: draft recommendations February 2014; weekly teleconferences started 12/3/2013; full white paper later in 2014
    • Chief Developmental Tester: joint meeting with E&T at the SED meeting; tutorial at the SE conference
    • Reliability standards input for the January/August meetings
    • "Test as a Stakeholder" joint meetings with SED: Modeling and Simulation; Architecture; Systems Security Engineering
  • Issues/concerns:
    • Industry engagement
    • NDIA T&E conference may be cancelled (Seattle, March)
    • ITEA conference (Denver, September) may be postponed and moved

  7. Backup: DT&E Committee Results, 2008-2013

  8. DT&E Committee Current Structure (since 2010)
  [Diagram: the NDIA DT&E Committee sits between the NDIA Systems Engineering Division and the NDIA Test and Evaluation Division, exchanging SED committee collaborative projects, project results, and SE/T&E/DT&E themes and issues. It supplies T&E themes for the NDIA Systems Engineering Conference (Q4), the NDIA Test and Evaluation Annual Conference (Q1), and periodic T&E committee-focused conferences; it also works with the NDIA Industrial Committee on Test and Evaluation (ICOTE) and cooperates with other NDIA divisions.]
  DT&E Committee focus: T&E initiatives aligned with SE and DT&E

  9. Summary of DT&E Committee Test and Evaluation Efforts

  10. Summary of DT&E Committee Systems Engineering Efforts

  11. DoD T&E Policy Study (2006-2008)
  • August 2006: DT&E Committee kickoff
  • Policy study: "Improving T&E in the DoD Acquisition Process", with industry T&E policy recommendations
  • Workshops: August 2007 and January 2008
  • Focus areas:
    1. Earlier contractor and tester involvement
    2. Integrated DT/OT and DT operational relevance
    3. Suitability
  • April 2008: report summarized results: 10 findings, 15 recommendations

  12. Integrated Testing (CT/DT/OT) Implementation Framework 2008-2010

  13. Software Test and Evaluation: Software Summit 2009

  14. RFP Language (2010-2011): industry comments for updating "Incorporating Test and Evaluation Into Department of Defense Acquisition Contracts", incorporating recommendations from the Software Summit

  15. Test and Evaluation for Systems of Systems (2009-2012)
  • 2009: "Sleepless Nights" list of issues
  • 2010: "Sominex" resulting initiatives
  • 2010: Workshop
  • 2011: Best practices wave model
  • 2012: Final report

  16. Effective Use of Modeling and Simulation for Test and Evaluation (2011)
  • Joint meeting in August 2011
  • Distributed testing, the Joint Mission Environment Test Capability (JMETC), and the Test and Training Enabling Architecture (TENA)
  • DoD M&S Community of Interest Data Management Working Group
  • LVC Architecture Roadmap Implementation (LVCAR-I) gateways effort: applicability to T&E
  • OSD T&E Working Group
  • Raytheon presentation on M&S for T&E
  • Potential topics for the November AMSWG meeting

  17. Modeling & Simulation for Distributed Testing (2012)
  • Benefits:
    • Find integration issues earlier
    • Test to learn in a 'safe' environment
    • Protect proprietary information
    • Facilitate the DT-to-OT transition
    • Increase performance testing range in operating environments
    • Support end-to-end studies throughout the program
  • Barriers:
    • Security
    • Lack of a persistent network
    • Early consideration of technical issues
    • Perceived value
    • Disconnect between the M&S and T&E communities
  • Recommendations:
    • Harmonize the M&S and test standards from a life-cycle perspective (HLA, TENA, metadata)
    • Create a framework for reusing and repurposing M&S through the product model
    • Establish M&S as part of statistical test design
    • Determine which tests are conducted to acquire data for model validation; fewer test events with better models
    • Recommend the use of M&S for integration and test (I&T)
    • Recommend establishing JMETC as a persistent node for industry to engage in MBDI&T
  Joint meeting August 2012; joint track at the SE Conference, October 2012

  18. Statistical Test Optimization (2012-2013)
  • 2012 SE Conference summit/workshop thread:
    • Tutorials on Monday 10/22
    • Presentations on Wednesday 10/24
    • Synthesis panel on Wednesday 10/24
  • Follow-on results:
    • DT&E Committee white paper
    • ITEA Journal article
    • SE Conference track, October 2013
    • MORS Conference, November 2013

  19. Requirements Verification Leading Indicator Metrics (2012-2013)
  • 2012 SE Conference workshop
  • Follow-on workshop with the System Performance Measurement WG
  • Focused on requirements verification not addressed in the first report
  • Potential measures identified:
    • System maturity level
    • Verification requirement maturity
    • Technical measures and stakeholder need
