
Closing the Loop for ISP using Performance Prediction Dec-05






Presentation Transcript


  1. Closing the Loop for ISP using Performance Prediction, Dec-05
  Greg Arnold, Ph.D., Gregory.Arnold@wpafb.af.mil
  Sensors Directorate, Air Force Research Laboratory
  AFRL/SNAT, Bldg 620; 2241 Avionics Circle; WPAFB OH 45433-7321; (937) 255-1115 x4388

  2. Trilogy of Thoughts / Goals
  • Playground: Urban SASO (Security & Stability Ops)
  • ISP: context is UAV (Uninhabited Air Vehicle) swarms & S-S fusion
    • Need multiple sensors
    • Confirmatory Sensing and Interrogation
    • Anomaly detection & backtracking
    • Understand the problem
  • Active Vision: manipulate the sensor to improve performance
    • Offline: ATR-driven sensing
    • Online: time reversal, active filter, Gotcha
  • ATR Theory: performance prediction is the key!
    • Reasoning in 3D (requires metrics)
    • Images are samples from the world
    • General => Specific for robustness

  3. Target Recognition: Levels of Discrimination (coarse to fine)
  • Detection: the level at which targets are distinguished from non-targets, i.e., clutter objects such as trees, rocks, or image-processing artifacts.
  • Classification: the level at which target class is resolved, e.g., building, vehicle, or aircraft.
  • Recognition: the level at which target subclass is determined, e.g., for a tracked vehicle: tank, APC, or ADU.
  • Identification: the level at which the model/make of a target is resolved, e.g., for a tank: M60, M1, or T72.
  • Fingerprint: the serial number of a particular instance of a target, i.e., Vince's Caravan vs. Lori's Caravan.

  4. Target Recognition: Levels of Automation
  • Interactive decision aid: human and machine work interactively
  • Automatic decision aid: machine is autonomous from input of data to output to the human; human makes the final decision (human-in-the-loop)
  • Autonomous system: machine makes the final decision; human is NOT in the loop

  5. What Does ATR Mean?

  6. ISR Goals (Intelligence, Surveillance, and Reconnaissance)
  • No Sanctuary
  • Persistent (PISR)
  • All Weather
  • Day / Night
  • All Terrain (city, country, forest, desert, ocean)
  • Moving & Stationary
  • Safety !!!
  (Kill-chain figure labels: Found Something; Go Get It; Hit It!; Shot)

  7. Automated Target Recognition (ATR) Insights
  • Information Limited: believe current performance is information limited
    • Human (Data >> Information): Pixels/Pupils ratio; better SNR, resolution, modalities
    • Machine: False Alarms (Google Search); finer discrimination / obscuration (>> higher resolution)
  • 3-D: intuitively understand geometric (3-D) information
  • UAVs: UAVs transform the CID problem!

  8. Sensors Directorate Structure
  • SN Directorate
    • SNA: Sensor ATR Technology
    • SNJ: Light (EO) Sensors
    • SNR: Radio (RF) Sensors
    • SNZ: Applications
  • SNA Division: Mike Bryant, Lori Westerkamp (Ed Zelnio)
    • SNAA: Evaluation
    • SNAR: Applications
    • SNAS: Modeling and Signatures
    • SNAT: Innovative Algorithms
  • SNAT Branch: Dale Nelson, Rob Williams
    • Generation After Next Technologies & Algorithms (Greg Arnold)
    • Tracking and Registration (Devert Wicker)
    • Vigilance (Kevin Priddy)

  9. ATR Thrust Scope
  (Figure: ROC curve (PD vs. FAR) over the functions FIND, TRACK, GEOLOCATE, ID (e.g., M60); maturation process spanning Sensors, Signatures, Algorithms, and Assessment.)

  10. ATR Thrust Approach - Subthrusts
  (Figure: the kill-chain functions FIND, FIX, TRACK, TARGET, ENGAGE, ASSESS supported by three subthrusts: Innovative Algorithms; Signatures & Modeling; Assessment & Foundation. Figure elements include FIND / FIX / TRACK & ID, operational target models/databases, characterized performance, high performance computing, operational databases, Sensor Data Management System (SDMS), Signature Center, phenomenology exploration, EM modeling, synthetic data, challenge problems, standard metrics, and ATR Theory.)

  11. ATR Goals: Near / Mid / Far Term
  We need a generalized pattern recognition capability that will classify things previously unseen, actively manage assets, and predict the intent and actions of combatants.
  • ATR for Anticipatory ISR: Multi-X fusion for PISR/TST; Dynamic GIG Sensor Management; ATR Theory for Anticipation
  • Adaptive ATR: on-the-fly modeling / reacquisition; reasoning with uncertainty; adaptive metrics derived from the user
  • 3-D ATR: 3-D Imaging for RF Floodlight; 3-D for urban context; ATR Theory challenge problem
  (Figure axes: Capability / Difficulty; spiral development across near, mid, and far term.)

  12. Background / Framework
  (Figure labels: Real World, Dimensionality, Models, Data, Assumptions / Beliefs)
  • Must Define Problem & EOCs: whether or not applying model-based vision; necessary for testing algorithm capabilities
  • Model-Based Vision: more than just CAD models; characterization of the data and the system at some level; "If I can't model it, I don't understand it"
  • Physics-Based Vision: what can we do before appealing to statistics

  13. Operating Conditions (OCs)
  (Figure: OCs arise from targets, sensors, the environment, and their interactions.)
  OCs: everything that changes the sensor response. Most OCs have infinite variation.

  14. Real-world variability: Extended Operating Conditions (EOCs)
  (Figure: example EOC axes: 20 target types; squint & depression angle; articulation; 6-DOF pose; configuration; obscuration; variants.)

  15. Discrimination vs. Robustness
  (Figure: a data-to-models spectrum trading discrimination against robustness; labels include match/MSE, reflectivity, quantized metrics, serial #, tuning points, binary shape, same sensor, sensor type, shape, and synthetic training.)

  16. Challenge Space: Using Information More Effectively
  • Bio-Inspired Adaptive ATR
  • ATR-Driven Sensing
  • Multisensor Approaches
  (Figure axis: More Information)

  17. Trilogy of Thoughts / Goals
  • Playground: Urban SASO
  • ISP: context is UAV swarms & S-S fusion
    • Need multiple sensors
    • Confirmatory Sensing and Interrogation
    • Anomaly detection & backtracking
    • Understand the problem
  • Active Vision: manipulate the sensor to improve performance
    • Offline: ATR-driven sensing
    • Online: time reversal, active filter, Gotcha
  • ATR Theory: performance prediction is the key!
    • Reasoning in 3D (requires metrics)
    • Images are samples from the world
    • General => Specific for robustness

  18. Vertically Integrated Sensor Exploitation for Generalized Recce & Instant Prosecution (VISEGRIP)

  19. Confirmatory Sensing & Interrogation Background
  • Goal: quantify the accuracy, completeness, & relevance of information with demonstrable authority. Challenge problem: support counter-WMD
  • Objective: theory & algorithm research to incorporate ATR Theory principles into a Sensor Management infrastructure modified to implement confirmatory sensing & interrogation
  • Payoff: a Pattern Recognition discipline that is more expressive, to assure users that the source is authoritative and the information is "actionable"
  Theory:
  • ATR Theory: aims to design and predict performance of sensor data exploitation systems; includes all forms of sensor data exploitation, i.e., target detection, tracking, recognition, and fusion
  • Information Theory: studies the collection and manipulation of information
  Algorithms:
  • Query Generation: what question to ask
  • Query Processing: when, how, & who to ask
  • Data Fusion: align redundant information; assess unique or contradictory information; assimilate valuable information
  • Evidence Assessment: quantify accuracy and completeness of assertions; predict a window of opportunity

  20. Trilogy of Thoughts / Goals
  • Playground: Urban SASO
  • ISP: context is UAV swarms & S-S fusion
    • Need multiple sensors
    • Confirmatory Sensing and Interrogation
    • Anomaly detection & backtracking
    • Understand the problem
  • Active Vision: manipulate the sensor to improve performance
    • Offline: ATR-driven sensing
    • Online: time reversal, active filter, Gotcha
  • ATR Theory: performance prediction is the key!
    • Reasoning in 3D (requires metrics)
    • Images are samples from the world
    • General => Specific for robustness

  21. ATR-Driven Sensing: Cueing, Prioritization for the Human

  22. Trilogy of Thoughts / Goals
  • Playground: Urban SASO
  • ISP: context is UAV (Uninhabited Air Vehicle) swarms & S-S fusion
    • Need multiple sensors
    • Confirmatory Sensing and Interrogation
    • Anomaly detection & backtracking
    • Understand the problem
  • Active Vision: manipulate the sensor to improve performance
    • Offline: ATR-driven sensing
    • Online: time reversal, active filter, Gotcha
  • ATR Theory: performance prediction is the key!
    • Reasoning in 3D (requires metrics)
    • Images are samples from the world
    • General => Specific for robustness

  23. What is your Objective Function?
  • Pd (Prob. of Detection)
  • Pcc (Prob. of Correct Classification)
  • Pe (Prob. of Error)
  • Pfa (Prob. of False Alarm)
  • Confusion Matrix
  • Precision
  • Recall
  • ROC Curve
  • L-p (L-1, L-2, L-infinity)
  • Diffusion Distance
  • Hausdorff
  • Chamfer
  • Ali-Silvey
  • Earth Mover's Distance
  • Chi-Squared
  • Entropy
  • Kullback-Leibler
  • Mutual Information
  • Maximum Likelihood
  • Renyi
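Several of these objective functions can be computed directly from a confusion matrix or a pair of discrete distributions. A minimal sketch (the counts and distributions below are made-up illustrations, and the helper names are not from the briefing):

```python
import math

def detection_metrics(tp, fn, fp, tn):
    """Basic ATR scores from a 2x2 confusion matrix."""
    pd = tp / (tp + fn)                    # Pd: probability of detection (recall)
    pfa = fp / (fp + tn)                   # Pfa: false-alarm probability
    pe = (fn + fp) / (tp + fn + fp + tn)   # Pe: probability of error
    precision = tp / (tp + fp)
    return pd, pfa, pe, precision

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

pd, pfa, pe, prec = detection_metrics(tp=80, fn=20, fp=10, tn=90)
print(pd, pfa, pe)   # 0.8 0.1 0.15
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

Sweeping a detector's threshold and recording (Pfa, Pd) pairs from this same confusion-matrix computation traces out the ROC curve listed on the slide.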

  24. 'Clear Box' View of ATR
  (Figure: ROC curve (PD vs. FAR); processing chain: Sensor => Feature Extractor => Discriminator => Decision Rule => Detect / Track / Geolocate / ID => ATR Decisions => Human Decisions. The chain draws on target knowledge, trained features, templates, models, and a Target Models & Database; the target and environment drive the sensor.)

  25. ATR/Fusion Processes
  (Figure: the ATR chain (Sensor(s) => Detect / Track / Geolocate / ID => ATR Decisions => Human Decisions) augmented with Sensor Management, Registration, Adaptation, and Anticipate functions, supported by Sensor, Environment, Behavior, and Performance Models and the Target Models & Database.)

  26. Performance Model is the Lynchpin
  • The ATR System is dependent on the Performance Model
  • Need performance prediction:
    • Determine where / when to use sensors
    • Estimate effectiveness of sensors for a given task
    • Sensor Management
    • Registration
    • Learning
    • …

  27. If Somebody Asks…
  • Typical DARPA questions:
    • "Is it physically possible to do X?"
    • We've invested $K and achieved P% performance; is it worth investing more?
  • Examples:
    • How likely are we to detect a dismount with an HSI system with 1m spatial resolution? 1ft? 1in?
    • We spent $40M and achieved 80% of perfection. Have we reached the knee in the performance curve?
    • Organization X says it can build a system to do Y. Does this violate physics?

  28. Aspects of ATR Theory
  (Columns on the slide: Objectives, Data Assessment, Design, System Evaluation.)
  • Measure the information content of sensor imagery
  • Given a set of data and a MOP, determine the attainable performance range
  • What are the critical design constraints to achieve a desired outcome, using this data?
  • Estimate the exploitation level of available information
  • Establish a "feedback loop" between ATR designers and sensor developers
  • What are the critical design constraints to achieve a desired outcome at a particular level of confidence?
  • Information gain from using models and data adaptively (learning)
  • Determine the theoretical upper bound on performance of a given ATR
  • Given an ATR system and a set of data, determine how much information can be exploited
  • Determine how close a given system comes to achieving the optimal bound
  • What are the critical design constraints to achieve a desired outcome, using this sensor and algorithm?
  • What was the benefit of adding 'this' (additional data/processing)?

  29. Problem Simplification • Having said all that, let’s examine a problem for which we have some intuition • 4 or 5 points undergoing rotation, translation, and maybe scale and skew • 1-D, 2-D, and 3-D • Understand the projection from world to sensor

  30. What is Shape? • Pose and scale invariant, coordinate independent characterization of an arrangement of features. • Residual geometric relationships that remain between features after “mod-ing out” the transformation group action. • Captured by a “shape space” where each distinct configuration of features (up to transformation) is represented by a single point.
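"Mod-ing out" a group action can be illustrated by putting point sets into a standard position. This sketch handles only translation and scale in 2-D (the helper `standard_position` is a hypothetical illustration, not from the briefing; the GWP case mods out a larger affine group):

```python
def standard_position(points):
    """Mod out translation and scale: centroid at origin, unit RMS radius."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    centered = [(x - cx, y - cy) for x, y in points]
    s = (sum(x * x + y * y for x, y in centered) / n) ** 0.5
    return [(x / s, y / s) for x, y in centered]

# Two configurations differing only by translation and scale
a = standard_position([(0, 0), (2, 0), (0, 2)])
b = standard_position([(5, 5), (9, 5), (5, 9)])   # same triangle, shifted and doubled
diff = max(abs(p - q) for u, v in zip(a, b) for p, q in zip(u, v))
print(diff < 1e-9)   # True: both configurations map to the same shape representative
```

Each output of `standard_position` is one representative of an equivalence class, i.e., a single point in the shape space.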

  31. Beyond Invariants: Invariants + Projection => Object-Image Relations

  32. Generalized Weak Perspective
  • Projection model applicable to optical images (pinhole camera)
  • Approximates full perspective for objects in the 'far field'
  • Affine transformations on 3-space, and in the image plane (2-space)
  • Denoted GWP

  33. Affine Transformations
  • In 3D: a (Rotate, Scale, Skew | Translate) transform applied to a 3-D point (matrix equation shown on the slide)

  34. GWP Projection: 3D to 2D
  (Figure: a 3-D object projected to a 2-D image.)
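Under generalized weak perspective, a 3-D affine transform followed by orthographic projection (dropping z) collapses to a 2x3 affine map plus translation. A sketch under those assumptions (helper names are illustrative, not from the briefing):

```python
def affine3d(M, t, p):
    """x' = M p + t, with M a 3x3 (rotate/scale/skew) matrix and t a 3-translation."""
    return [sum(M[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

def gwp_project(M, t, p):
    """GWP image of a 3-D point: apply the affine transform, then drop z."""
    x, y, _ = affine3d(M, t, p)
    return [x, y]

M = [[2, 0, 0], [0, 2, 0], [0, 0, 2]]   # uniform scale by 2
t = [1, -1, 0]                          # translation
print(affine3d(M, t, [1, 2, 3]))        # [3, 3, 6]
print(gwp_project(M, t, [1, 2, 3]))     # [3, 3]
```

Because only the top two rows of M and t survive the drop of z, the composite map is exactly the 8-parameter GWP camera used in the counting on the following slides.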

  35. Object-Image Relation Motivation
  (Figure: Objects 1 and 2 and Images 1 and 2.)
  • Image 1 is not equivalent to Image 2 (in 2-D)
  • Object 1 is not equivalent to Object 2 (in 3-D)

  36. Object-Image Relations Concept
  "The relation between objects and images expressed independent of the camera parameters and transformation group"
  (1) Write out the camera equations (geo or photo)
  (2) Eliminate the group & camera parameters
  (3) Recognize the result as a relation between the object and image invariants
  But pure elimination is VERY difficult, even for polynomials.

  37. Weak Perspective Object-Image Relations
  • Weak perspective (generalized)
  • Parallel things remain parallel
  • The object size is 1/10 the distance from the camera
  • (Standard Position Method)

  38. Weak Perspective Camera
  (Figure: model points P1..P5 imaged to q1..q5.)
  • 3-D Model: Pi = {xi, yi, zi}, N points (3N DOF); Rotate, Translate, Scale, Shear (12 constraints) => 3N-12 absolute invariants
  • 2-D Image: qi = {ui, vi}, N points (2N DOF); Rotate, Translate, Scale, Shear (6 constraints) => 2N-6 absolute invariants
  • Camera Model: N points (2N DOF); union of 2-D & 3-D (8 constraints) => 2N-8 relations
  • Need 5 corresponded points (minimum)
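The counting above works out as follows, writing the GWP camera as a 2x3 matrix plus a 2-vector translation (8 parameters); this is a restatement of the slide's numbers:

```latex
\underbrace{3N}_{\text{object coords}} - \underbrace{12}_{GL_3\,+\,\text{translation}} = 3N-12,
\qquad
\underbrace{2N}_{\text{image coords}} - \underbrace{6}_{GL_2\,+\,\text{translation}} = 2N-6,
\qquad
2N - \underbrace{8}_{\text{camera parameters}} = 2N-8 \ \text{relations}.
```

For N = 4 there are 2N - 8 = 0 relations (any four generic points can match some camera), while N = 5 gives two relations, which is why five corresponded points are the minimum.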

  39. 3-D Invariants
  • 3-D Model: Pi = {xi, yi, zi, 1}, 5 points; GL(3) + translation (12 constraints) => 3N-12 absolute invariants
  • Invariant is a function of the ratio of determinants; a useful standard position is given on the slide [formulas omitted]

  40. 2-D Invariants
  • 2-D Image: qi = {ui, vi, 1}, 5 points; GL(2) + translation (6 constraints) => 2N-6 absolute invariants
  • Invariant is a function of the ratio of determinants; a useful standard position is given on the slide [formulas omitted]
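The determinant-ratio formulas on the two slides above were embedded as images and did not survive extraction. A representative form consistent with the text (the specific normalization on the slides may differ) uses bracket determinants of the homogeneous coordinates:

```latex
[\,ijkl\,] = \det\begin{pmatrix}
x_i & x_j & x_k & x_l\\
y_i & y_j & y_k & y_l\\
z_i & z_j & z_k & z_l\\
1 & 1 & 1 & 1
\end{pmatrix},
\qquad
[\,ijk\,] = \det\begin{pmatrix}
u_i & u_j & u_k\\
v_i & v_j & v_k\\
1 & 1 & 1
\end{pmatrix},
```

with absolute invariants given by ratios such as

```latex
I_1 = \frac{[1235]}{[1234]},\quad
I_2 = \frac{[1245]}{[1234]},\quad
I_3 = \frac{[1345]}{[1234]}
\qquad (3\cdot 5 - 12 = 3 \text{ invariants in 3-D};\ 2\cdot 5 - 6 = 4 \text{ in 2-D}).
```

The counts match the slides' 3N-12 and 2N-6 formulas for N = 5.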

  41. Object-Image Relation: Generalized Weak Perspective Camera
  (2-D Standard Position) = (Camera Transform) (3-D Standard Position)
  Eliminate the camera transform parameters: the camera transforms the first 4 object points to image points; the remaining points satisfy the object-image relation iff the condition on the slide holds [formula omitted].

  42. Object-Image Relation Abstraction
  (Figure: the object-image relations connect "all objects that could have produced the image" with "all images of the object".)

  43. GWP Shape Spaces
  • The shape spaces in the GWP case are Grassmann manifolds
  • In 3D: Gr(n-4, H), or dually the Schubert cycle of 4-planes in Gr(4, n) which contain (1, ..., 1); the manifold has dimension 3n-12
  • In 2D: Gr(n-3, H), or dually the Schubert cycle of 3-planes in Gr(3, n) which contain (1, ..., 1); the manifold has dimension 2n-6
  • H is the subspace of n-space orthogonal to the vector (1, ..., 1)
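The stated dimensions follow from the standard Grassmannian dimension formula, with H of dimension n-1:

```latex
\dim \mathrm{Gr}(k, m) = k\,(m-k), \qquad \dim H = n-1,
```

```latex
\dim \mathrm{Gr}(n-4,\,H) = (n-4)\bigl[(n-1)-(n-4)\bigr] = 3(n-4) = 3n-12,
\qquad
\dim \mathrm{Gr}(n-3,\,H) = (n-3)\bigl[(n-1)-(n-3)\bigr] = 2(n-3) = 2n-6,
```

which agree with the invariant counts 3N-12 and 2N-6 from the weak-perspective camera slide.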

  44. Why • We associate to our object data, viewed as a linear transformation from n-space to 4-space, its null space K of dimension n-4. • Likewise to our image data in 2D we associate the null space L of dimension n-3.

  45. Global Shape Coordinates • Better than local invariants • Come from an isometric embedding of the shape space in either Euclidean space or projective space. • Matching expressed in these coordinates will gracefully degrade

  46. Example in GWP
  • 3D, n = 5 feature points
  • Global shape coordinates are the Plücker coordinates (or dual Plücker coordinates) of the 4xn object data matrix or the 3xn image data matrix.
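For the 2-D side with n = 5, the global shape coordinates are the ten 3x3 minors of the 3x5 homogeneous image data matrix. Under a linear change of image coordinates T, every minor rescales by det(T), so ratios of minors are shape coordinates. A sketch (helper names `det3`, `plucker`, `apply` are illustrative, not from the briefing):

```python
from itertools import combinations

def det3(a, b, c):
    """Determinant of the 3x3 matrix with columns a, b, c."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - b[0] * (a[1] * c[2] - a[2] * c[1])
            + c[0] * (a[1] * b[2] - a[2] * b[1]))

def apply(T, v):
    """Matrix-vector product for a 3x3 matrix T (list of rows)."""
    return tuple(sum(T[r][c] * v[c] for c in range(3)) for r in range(3))

def plucker(cols):
    """All maximal (3x3) minors of a 3xn matrix given as n homogeneous columns."""
    return [det3(cols[i], cols[j], cols[k])
            for i, j, k in combinations(range(len(cols)), 3)]

q = [(0, 0, 1), (1, 0, 1), (0, 1, 1), (2, 3, 1), (4, 1, 1)]   # image points (u, v, 1)
T = [[2, 1, 0], [0, 1, 0], [0, 0, 1]]                          # GL(3) transform, det(T) = 2
qT = [apply(T, v) for v in q]
print(all(b == 2 * a for a, b in zip(plucker(q), plucker(qT))))   # True
```

Since every coordinate scales by the same det(T), the tuple of minors is a well-defined point in projective space, i.e., a global shape coordinate.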

  47. Global Object-Image Relations
  • General: if-and-only-if conditions; an overdetermined set of equations
  • GWP: to match, K must be contained in L (iff)
  • This incidence condition can be expressed in terms of the global shape coordinates
  • For n = 5, there are 10 (non-independent) relations that look like:
    [1234][125] - [1235][124] + [1245][123] = 0
  • Locally only 2 of the 10 are independent, because the locus V of matching pairs (object shape, image shape) in the 7-dimensional product space X x Y has dimension 5, codimension 2.
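The matching condition K ⊆ L (object null space contained in image null space) is equivalent, by elementary linear algebra, to the image rows lying in the row space of the homogeneous object data matrix: for 5 points, the stacked 7x5 matrix has rank 4 for a matching pair and rank 5 generically otherwise. A numerical sketch with an assumed random GWP camera (helper names `rank` and `stacked` are illustrative, not from the briefing):

```python
import random

def rank(rows, tol=1e-9):
    """Matrix rank by Gaussian elimination with partial pivoting."""
    m = [list(r) for r in rows]
    nrows, ncols, r = len(m), len(m[0]), 0
    for c in range(ncols):
        piv = max(range(r, nrows), key=lambda i: abs(m[i][c]))
        if abs(m[piv][c]) < tol:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(nrows):
            if i != r:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
        if r == nrows:
            break
    return r

random.seed(1)
P = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(5)]   # 5 object points
A = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]   # GWP camera matrix
t = [0.3, -0.7]                                                     # image translation
Q = [[sum(A[r][k] * p[k] for k in range(3)) + t[r] for r in range(2)] for p in P]

def stacked(P, Q):
    """Rows: object x, y, z, 1 and image u, v, 1 (each row has 5 entries)."""
    rows = [[p[k] for p in P] for k in range(3)] + [[1.0] * len(P)]
    rows += [[q[k] for q in Q] for k in range(2)] + [[1.0] * len(Q)]
    return rows

print(rank(stacked(P, Q)))                                  # 4: K is contained in L
Qbad = [[u + random.uniform(0.2, 0.5), v] for u, v in Q]    # corrupt the image
print(rank(stacked(P, Qbad)))                               # 5: no match
```

The rank-4 condition packages the two locally independent relations mentioned above into one containment test, which is how the incidence condition is checked in practice.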

  48. Beyond Object-Image Relations: Object-Image Relations + Matching => Object-Image Metrics
