Meeting the Challenges of Unmanned and Autonomous System Test and Evaluation
Thomas Tenorio, Subject Matter Expert for UAST Executing Agent
10 March 2010, USC

Activities: Working Group Roadmapping, Surveys, Networking, Tech Eval BAA, Supply Space Surveys, UAST Roadmap
Tagged as National Asset
Major Range and Test Facility Base
S&T for physical test capabilities associated with Test Bed and Environment
Documented (PoR) UAS Missions and Technologies per USIR
Draft Drivers, Use Cases, Test Concept, Test Plans
Draft Test Requirements
Refined Drivers, Use Cases, Test Concept, Test Plans
Refined and Extended (beyond PoR) UAS Missions and Technologies with Working Group
T&E Needed to Test Specific UAS Technologies
Facility Specific Descriptions &
Requirements (Hard Numbers)
Test Resource Requirements
Baseline T&E Capabilities (we do not want to conduct an exhaustive survey of all the T&E capabilities that exist and could support UAS T&E)
Tri-Service Baseline Capability
Test Resource Survey
Gaps sensitive to New Approaches and New Paradigms
Draft Test Resource Needs Analysis (Gap)
Refined Test Resource Needs Statements
BAA, RFI, White Papers
UAS Safety, Suitability, Survivability, Effectiveness
Test and Evaluation of UAS as Highly Complex Systems
What if --?
Cross-Domain Commonality … Specificity
Emulating Mission & Environmental Complexity with Assured Safety
Assessing Effects and Capabilities
UAST Tools & Techniques
Reference Data Sets Ground Truth Decision & Behavior
Protocols & Design
Test Bed and Environment
Evolutionary acquisition with JUTLS & JUONS
Booming capability development
Capability challenge of 311 named systems
Majority system non-PORs
Fielding tech in months: 4-6 months for joint operational necessity
Ensure test capabilities support the fielding of unmanned systems that are effective, suitable, and survivable
Next Gen Tech
Standard Systems T&E
Autonomous System T&E
Unmanned & Autonomous Systems Test
UAS Test & Evaluation Focus Areas
* Safety Considerations in all 7 Areas
Predicting Unmanned and Autonomous System Behaviors
Emulating Mission and Environmental Complexity for Assured Safety
Assessing UAS Effects and Capabilities
Autonomous Test Protocols and Design
Test Bed and Environments for UAST
UAST Reference Data Sets (Ground Truth, Decision, & Behavior)
UAST Tools and Techniques
Across UAS(s) OODA
Tools for Design of Experiments in system, multisystem, and system-of-systems scenarios, also considering the implications of mission scenarios, opposition capability, and physical context.
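As an illustrative sketch of what a Design-of-Experiments tool does at its simplest, the snippet below enumerates a full-factorial design over a few hypothetical test factors (the factor names and levels are invented for this example, not drawn from the UAST program):

```python
from itertools import product

# Hypothetical test factors for a multi-UAS scenario; names and levels
# are illustrative only.
factors = {
    "mission":    ["ISR", "strike", "resupply"],
    "opposition": ["none", "jamming", "kinetic"],
    "terrain":    ["open", "urban", "littoral"],
    "team_size":  [1, 4, 16],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full-factorial design)."""
    names = list(factors)
    for levels in product(*(factors[n] for n in names)):
        yield dict(zip(names, levels))

runs = list(full_factorial(factors))
print(len(runs))  # 3 * 3 * 3 * 3 = 81 candidate test runs
```

In practice a DoE tool would prune this combinatorial space (e.g. fractional-factorial or space-filling designs) rather than execute all 81 runs, but the enumeration above is the starting point.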
Ability to incorporate UAS design models into warfighter-scope models/simulations in order to anticipate mission suitability, safety, effectiveness and survivability (including countermeasures).
Determining how to manipulate live physical scenarios including Red Forces. Acquiring ground truth data during actual test operations.
Bayesian Belief Networks and similar tools for conflating test data to mission-level expectations.
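A minimal sketch of the conflation idea: a two-parent Bayesian belief network that rolls component-level test outcomes up into a mission-level success estimate. All probabilities here are invented for illustration; a real network would be elicited from test data and subject-matter experts.

```python
# Priors over component test outcomes (True = component passed)
P_sensor = {True: 0.9, False: 0.1}
P_comms  = {True: 0.8, False: 0.2}

# Conditional probability table: P(mission success | sensor, comms)
P_mission = {
    (True,  True):  0.95,
    (True,  False): 0.60,
    (False, True):  0.40,
    (False, False): 0.05,
}

def p_mission_success(sensor=None, comms=None):
    """P(mission success), marginalising over any unobserved parents."""
    total = 0.0
    for s in (True, False):
        if sensor is not None and s != sensor:
            continue
        for c in (True, False):
            if comms is not None and c != comms:
                continue
            w = (1.0 if sensor is not None else P_sensor[s]) * \
                (1.0 if comms is not None else P_comms[c])
            total += w * P_mission[(s, c)]
    return total

print(p_mission_success(sensor=True))              # 0.88 (comms unobserved)
print(p_mission_success(sensor=True, comms=True))  # 0.95
```

The point of the technique is exactly this roll-up: individual test observations enter as evidence, and the network propagates them into a mission-level expectation.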
Ensuring that Net Centric Systems and relevant test assets are sufficiently agile to enable span and dynamics of UAS test scenarios.
Systems architecting and engineering of a family of composable UAST’s.
What: Discover internal bugs and vulnerabilities, and external incompatibilities, in UASs, across UASs, and in T&E systems.
Where: In executable code, source code, databases, system models, mission simulations, and SoS configurations.
At development, integration, warfighter and depot locations.
Why: Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User controllable degree of False Positives and False Negatives.
How: Code inspection. Test beds not required.
Mathematically rigorous assessment method and tools.
Enabled by next generation pattern recognition semiconductor chips with throughput ≈ 1 Gb/sec
When: TRL3@2010, TRL5 @2011, TRL6@2012, TRL9@2014
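To make the "code inspection, test beds not required" idea concrete, here is a toy static-inspection pass that scans source text for a few risky patterns. The patterns and the regex approach are illustrative only; the capability envisioned in the briefing relies on hardware-accelerated pattern recognition, not simple regular expressions.

```python
import re

# Illustrative patterns; a real inspection capability would use a far
# richer rule base and formal analysis, not three regexes.
RISKY_PATTERNS = {
    "hardcoded_credential": re.compile(r"password\s*=\s*['\"]\w+['\"]", re.I),
    "unchecked_exec":       re.compile(r"\bos\.system\s*\("),
    "bare_except":          re.compile(r"^\s*except\s*:", re.M),
}

def inspect_source(text):
    """Return {finding_name: [line_numbers]} for each matched pattern."""
    findings = {}
    for name, pat in RISKY_PATTERNS.items():
        lines = [text.count("\n", 0, m.start()) + 1 for m in pat.finditer(text)]
        if lines:
            findings[name] = lines
    return findings

sample = 'password = "hunter2"\ntry:\n    pass\nexcept:\n    pass\n'
print(inspect_source(sample))  # flags lines 1 and 4
```

Note what the sketch shares with the stated goal: findings come from the artifact itself (no test bed), and the rule set is where the user would tune the false-positive/false-negative trade-off.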
Safety Assurance *
What: Discern and referee the contest between autonomy and safety, both a) test safety, including, e.g., FAA requirements, and b) operational safety, e.g., fratricide and harm to innocent civilians.
Enable both static and evolving limits.
Assess efficacy of UAS Self-test capability, resilience to cyber threats, probable error in M&S evaluations of UAS(s).
Where: Throughout 5000.02 phases and Warfighter stages. Across UAS, UASs, SoS. Spans both on-board and administrator functions.
Why: Avoid unintended consequences of UAS operations. Generate warfighter-confident knowledge. Cut test cycle time and cost in half. User controllable degree of False Positives and False Negatives.
How: A ‘Do No Harm’ OODA loop inside the autonomy loop of both the UAS and the UAST. Method for Preempting UAS behaviors, separate from Planner capability, preferably non-destructive.
When: for autonomy Level 1@2010, 2@2011, 3@2012, 4@2013, 5@2014
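The "Do No Harm" loop above can be sketched as a safety referee that sits between the planner and the actuators: every proposed command is observed and either passed through or vetoed (a non-destructive preemption) before it acts. The geometry, limits, and command fields below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Command:
    heading_deg: float
    speed_mps: float
    weapons_release: bool = False

@dataclass
class SafetyLimits:
    max_speed_mps: float = 50.0
    # Hypothetical keep-out heading sector, e.g. toward a populated area
    keep_out_headings: tuple = (170.0, 190.0)

def do_no_harm(cmd, limits):
    """Observe/Orient/Decide on a proposed command; Act = pass or veto."""
    if cmd.speed_mps > limits.max_speed_mps:
        return False, "speed limit exceeded"
    lo, hi = limits.keep_out_headings
    if lo <= cmd.heading_deg <= hi and cmd.weapons_release:
        return False, "weapons release toward keep-out sector"
    return True, "ok"

limits = SafetyLimits()
print(do_no_harm(Command(180.0, 30.0, weapons_release=True), limits))
print(do_no_harm(Command(90.0, 30.0), limits))
```

The design point the sketch captures is separation of concerns: the referee holds its own limits and veto authority, independent of the planner that generated the command, so a planner fault cannot disable the safety check.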