Recognition System Capacity - PowerPoint PPT Presentation

Presentation Transcript

  1. Recognition System Capacity Joseph A. O’Sullivan, Samuel C. Sachs Professor, Electronic Systems and Signals Research Laboratory, Department of Electrical and Systems Engineering, Washington University in St. Louis; (314) 935-4173; http://essrl.wustl.edu/~jao; jao@wustl.edu Michael D. DeVore, UVA; Naveen Singla; Brandon Westover Supported in part by the Office of Naval Research Adaptive Sensing MURI Review, 06/27/06

  2. Recognition System Capacity • Motivation: • ATR; Network Centric Warfare; Biometrics; Image Understanding • Active Computations • Achievable Rate Regions • Inner and outer bounds • Successive refinement

  3. Why Theorems? • ONR Perspective: Want Systems That Work • Implementable on projected system architecture • Good performance • Our Perspective: Theorems Provide • Provable performance: bounds and guidelines • Validation and critique of existing system designs • Motivation for recognition system design: system architectures; database design; optimal compression for recognition; communication for recognition; active computations • Growing Awareness of Importance of Information Theory

  4. Perspective on Image Understanding • “Finding tanks is so World War II.” Bob Hummel, DARPA program manager, ATR Theory Workshop, Dec. 2004 • What make of car? • What year? • Who is driving? • Where has it been? • Improvised Explosive Devices (IEDs) • Demand more information from imagery

  5. Biometrics Must Be • Universal • Permanent • Unique • Measurable Uniqueness → How unique? Bits. Measurability → How measurable? Bits. Encoding. A. K. Jain, et al., “Introduction to Biometrics,” 1999 John Daugman, http://www.cl.cam.ac.uk/users/jgd1000/
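The “how unique, in bits” question on this slide can be made concrete with Shannon entropy. A minimal sketch, assuming a hypothetical feature code with equally likely, independent symbols (the alphabet size and code length below are illustrative, not from the slide):

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: 4 equally likely symbols per code position,
# 8 independent positions.
per_symbol = entropy_bits([0.25] * 4)   # 2.0 bits per position
total_bits = per_symbol * 8             # 16.0 bits of "uniqueness"
distinguishable = 2 ** total_bits       # at most 2^16 distinct patterns
```

Correlated or biased features would lower the entropy, and with it the number of distinguishable patterns.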


  6. Recognition System Capacity • Motivation: • ATR; Network Centric Warfare; Biometrics; Image Understanding • Active Computations • Achievable Rate Regions • Inner and outer bounds • Successive refinement

  7. Active Computations Concept (block diagram: sensors produce sensor data for the ATR algorithm, which outputs a target type estimate; system performance analysis yields a performance estimate; resource allocation manages network resources such as databases and communications) • Compute a sequence of inferences and performance estimates (probabilities or reliabilities). • Monitor available resources (time, processors, bandwidth, database, …). • Feed back performance: select next computation; reallocate resources; demand more data.

  8. Active Computations Concept (same block diagram: sensors, sensor data, ATR algorithm, system performance analysis, and resource allocation over network resources) • Successively refined inferences • Time or resources to achieve performance goal. • Additional data required to achieve performance goal.

  9. Total System Performance (diagram labels: resource consumption, approximations) • ATR system performance entails more than just accuracy: • Time to classify a target • Electrical power dissipation • Sensor engagement, CPU cycles, bits communicated, and other “opportunity costs” • Need real-time estimates of total system performance • Enable informed tradeoff of ATR accuracy with throughput and network resource consumption • Dynamically adapt the system as requirements, capabilities, and operational scenarios evolve

  10. Active Computation • The need to actively manage computations is acute in complex, time-critical environments • Information has a time value • Some information now may be better than a lot of information later, after it is too late to take decisive action • Ideally, we’d like some information now and more later • Static ATR implementations perform the same computations for every image they receive • No tentative answers are available before processing is finished • Availability of more time will not improve the solution accuracy

  11. Active Computation • Dynamic ATR systems employ active computations to maximize the time-value (or resource-value) of information • Approach: Generate a sequence of increasingly accurate classifications • More resources are consumed at every stage • Continue until accuracy is good enough, resource cap is reached, or the result is no longer relevant • Control the computations to maximize the total information value
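The stage-until-done loop described on this slide can be sketched as an anytime control loop. The stage functions, confidence threshold, and deadline below are assumptions for illustration, not part of the talk:

```python
import time

def active_classify(stages, deadline_s, target_confidence):
    """Run increasingly accurate classification stages until the answer
    is good enough, the deadline passes, or the stages run out.
    `stages` is a list of callables, each returning (label, confidence);
    this sketch assumes each is more expensive and more accurate than
    the last."""
    best = (None, 0.0)
    start = time.monotonic()
    for stage in stages:
        label, conf = stage(best)        # each stage may reuse the previous answer
        if conf > best[1]:
            best = (label, conf)
        if best[1] >= target_confidence:           # accuracy is good enough
            break
        if time.monotonic() - start >= deadline_s:  # result would be stale
            break
    return best  # a tentative answer is always available

# Toy stages with fixed (label, confidence) outputs.
stages = [lambda prev: ("tank", 0.6), lambda prev: ("tank", 0.9)]
answer = active_classify(stages, deadline_s=1.0, target_confidence=0.85)
```

Unlike a static pipeline, interrupting this loop at any point still yields the best answer found so far.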

  12. Active Computation • The maximum-likelihood ATR solution maximizes the likelihood of the data over candidate patterns • Solve a sequence of simpler problems • The approximating objective functions get closer to the true likelihood with each stage • Each problem is easy to solve given previous solutions • Let a sequence of problems be chosen up to stage k, with an associated error and resources used • The best strategy at stage K minimizes the total expected cost

  13. Active Computation • Seek heuristic strategies that do not require prior knowledge of K, but are nearly optimal for all K • For example, maximize the expected future increase in likelihood

  14. Recognition System Capacity • Motivation: • ATR; Network Centric Warfare; Biometrics; Image Understanding • Active Computations • Achievable Rate Regions • Inner and outer bounds • Successive refinement

  15. Recognition System Capacity:More Motivation • Number of bits for recognition • Number of patterns that can be distinguished • Number of bits to extract from data • Size of long term memory • Data and processing dependence • Start with simple i.i.d. model

  16. System model (block diagram): patterns X1, X2, …, XMc are drawn i.i.d. from p(x); the memory encoder f compresses each into a memory representation U1, U2, … at rate Rx; one pattern xh is selected according to p(h) and passed through the channel p(y|x) to produce sensory data y; the sensory encoder φ compresses y at rate Ry into the sensory representation V; the decoder g produces the estimate ĥ from the stored representations at rate Rc. Objective: Pr{g(V)=h} > 1−ε, s.t. R = (Rc, Rx, Ry)
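To make the channel term in this model concrete, here is a small mutual-information calculation; the binary symmetric channel below is an illustrative assumption, not an example from the talk:

```python
import math

def mutual_information_bits(pxy):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Binary symmetric channel, crossover 0.1, uniform input:
eps = 0.1
pxy = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
       (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}
rc_limit = mutual_information_bits(pxy)  # about 1 - H2(0.1), ~0.531 bits
```

With unlimited U,V capacity this I(X;Y) is exactly the ceiling on the recognition rate Rc in the region characterized on the later slides.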

  17. Pattern Recognition Codes and Achievable Rates

  18. Characterizing

  19. (Figure: achievable Rc plotted over I(X;U), from 0 to H(X), and I(Y;V), from 0 to H(Y).) • ‘Unlimited’ U,V capacity: U=X, V=Y → Rc < I(X;Y) (random channel coding) • Edges: V=Y gives Rc = I(X;Y) − I(X;Y|U); U=X gives Rc = I(X;Y) − I(X;Y|V) • Poor memory: U=0 → Rc < I(0;V) = 0 • Poor senses: V=0 → Rc < I(U;0) = 0 • In general: Rx > I(X;U), Ry > I(Y;V), Rc < I(U;V) − I(U;V|X,Y) • On the border, U – X – Y – V, so R* = R** = R.

  20. A Related Gap: the distributed source coding problem (diagram: X and Y drawn from p(x,y) are separately encoded by f and φ into (U,V) and jointly decoded by g) • Problem: Characterize the achievable (Rx, Ry, Dx, Dy) • Sergio Servetto claimed a solution at ITW 2006 • The solution should transfer to our problem

  21. Related Work

  22. Naval Impact/Payoff: The Sensor-to-Shooter Problem (block diagram) • Targeting Info Link: link target data (type, location, motion, …) to TOC • Sensor Data Link: recce imagery (SAR, IR, visible); intelligent bandwidth compression; wide-area cueing (ATC); ATR; reference library, aimpoint • Ground Recce/Intel Station and Strike Planning System: select target; strike planning; weapon selection • Shooter Link: strike plan; target location; ATR parameters • Weapon Link: terminal ATR

  23. Convex Hull Inner Bound, Westover and O’Sullivan ISIT 2005 • The convex hull of this inner bound is achievable. • Consider the set of all distributions such that conditioned on a random variable Q, we have U – X – Y – V. Then the achievable region is • For every case examined, this convex hull is achieved by time sharing between a length-4 Markov chain and the (0,0,0) point.
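The time-sharing argument behind the convex hull is mechanical: run one code for a fraction λ of the block and the other for the rest, so the achieved rate triple interpolates linearly. A sketch with a hypothetical corner point (the numbers are illustrative, not from the talk):

```python
def time_share(r1, r2, lam):
    """Rate triple (Rx, Ry, Rc) achieved by using code 1 for a fraction
    `lam` of the block and code 2 for the rest -- the standard
    time-sharing argument that convexifies an achievable region."""
    return tuple(lam * a + (1 - lam) * b for a, b in zip(r1, r2))

# Hypothetical corner point, time-shared with the trivial (0,0,0) point:
corner = (0.8, 0.8, 0.5)
hull_segment = [time_share(corner, (0.0, 0.0, 0.0), lam)
                for lam in (0.0, 0.25, 0.5, 0.75, 1.0)]
```

Sweeping λ traces the whole segment between the two points, which is exactly the hull behavior the slide reports.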

  24. Successive Refinement, Two-Stage Recognition: Given a sequence of (Mx1, My1, Mc1, n) pattern recognition codes, design a sequence of (Mx2, My2, Mc2, n) PR codes with the first sequence as subcodes, where Mx1 ≤ Mx2, My1 ≤ My2, Mc1 < Mc2 (“up and to the right”). Refining Code: (f2,Φ2,g2)n; Coarse Code: (f1,Φ1,g1)n. The rate sextuplet (Rx1,Ry1,Rc1,Rx2,Ry2,Rc2) is achievable if there exist sequences of recognition codes (f1,Φ1,g1)n and (f2,Φ2,g2)n meeting those rates with vanishing error probability. Comment: two different systems (different patterns)

  25. Achievability: Inner Bound • Theorem: Two-stage recognition is achievable if there exist • auxiliary random variables U1, V1, U2, and V2 satisfying • Markov conditions: U1 – X – Y – V1 and • (U1,U2) – X – Y – (V1,V2). • Rate Bounds: • Inner Bound: Proof Sketch. At the coarse stage: • Use the coding strategy for the single-stage pattern recognition system • At the refining stage: • Given memory and sensory indices from the coarse stage, generate “refining” codebooks according to the conditional distributions p(u2|u1(·)) and p(v2|v1(·)). • Encode memory and sensory data with pairs of indices corresponding to coarse and refining stage • Use typical-set decoding to identify pattern
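The refining-codebook step of the proof sketch can be illustrated with a toy conditional-codebook generator; the symbol alphabet and the kernel p(u2|u1) below are assumptions for illustration, not quantities from the talk:

```python
import random

def conditional_codebook(coarse_codeword, p_u2_given_u1, size, seed=0):
    """Generate a refining codebook: for each symbol u1 of the coarse
    codeword, draw u2 i.i.d. from p(u2 | u1), producing `size` refining
    codewords. `p_u2_given_u1` maps a u1 symbol to a list of
    (u2 symbol, probability) pairs."""
    rng = random.Random(seed)

    def draw(u1):
        symbols, weights = zip(*p_u2_given_u1[u1])
        return rng.choices(symbols, weights=weights)[0]

    return [[draw(u1) for u1 in coarse_codeword] for _ in range(size)]

# Hypothetical binary refinement kernel:
p = {0: [(0, 0.9), (1, 0.1)], 1: [(0, 0.2), (1, 0.8)]}
book = conditional_codebook([0, 1, 1, 0], p, size=4)
```

Each refining codeword is generated symbol-by-symbol around the coarse codeword, which is what ties the refining stage's indices to the coarse stage's.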

  26. Inner Bound: Successive Refinement • Corollary: Successive refinement is achievable if there exist • auxiliary random variables U1, V1, U2, and V2 satisfying • Markov condition: U1 – U2 – X – Y – V2 – V1. • Rate bounds: Analogous to the Markov condition for successive refinement for rate-distortion. Equitz and Cover, “Successive refinement of information,” IEEE Trans. Info. Theory, Mar. 1991.

  27. Converse: Outer Bound Theorem: If the rate sextuplet (Rx1,Ry1,Rc1,Rx2,Ry2,Rc2) is achievable then there exist auxiliary random variables U1, V1, U2, and V2 satisfying • Markov conditions: U1 – X – Y and X – Y – V1 and (U1,U2) – X – Y and X – Y – (V1,V2). • Rate Bounds: Corollary: Two length-4 Markov chains follow: U1 – U2 – X – Y and X – Y – V2 – V1

  28. Extension: Hierarchical Recognition Based on Random Labels • Extend results so that the codebook is the same for the two stages • Randomly label each Xn(k) with a label L(k), out of exp[n(Rc2-Rc1)] labels • For each label, use a (Mx1, My1, Mc1, n) pattern recognition code • Given Yn, run every decoder (for every label) → list of exp[n(Rc2-Rc1)] possible patterns • Use refinement codebooks to determine label and therefore the pattern
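The random-label construction sketched above resembles random binning: the label thins the coarse-stage list down to patterns consistent with the recovered label. A toy sketch, where the pattern and label counts are small illustrative stand-ins for exp[n(Rc2−Rc1)]:

```python
import random

def random_labels(num_patterns, num_labels, seed=0):
    """Randomly assign each pattern index k a label L(k); num_labels
    stands in for exp[n(Rc2 - Rc1)] in the slide's construction."""
    rng = random.Random(seed)
    return {k: rng.randrange(num_labels) for k in range(num_patterns)}

def thin_by_label(candidates, labels, recovered_label):
    """Keep only coarse-stage candidates whose label matches the one
    recovered by the refinement codebooks."""
    return [k for k in candidates if labels[k] == recovered_label]

labels = random_labels(num_patterns=16, num_labels=4)
candidates = list(range(16))       # coarse decoding yields a list
survivors = thin_by_label(candidates, labels, labels[5])
# the true pattern (index 5 here) always survives the thinning
```

With rates chosen as on the slide, the list is exponentially large but the label removes all wrong patterns with high probability; this toy version only shows the mechanics.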

  29. Extension: Hierarchical Recognition Based on Hierarchical Pattern Model (diagram: class W ~ p(w) generates identities X ~ p(x|w)) • Assume that the patterns are generated by a hierarchical model W → X (class → identity) • Inner Bound: U1 – W – Y – V1 and U1 – U2 – X – Y – V2 – V1 • Use a (Mx1, My1, Mc1, n) pattern recognition code to obtain Wn(i) (class) • Use refinement codebooks to determine Xn(i,j) (identity)

  30. Extensions • Inner bounds for prototypical examples: Gaussian, binary. Convex hull is achievable by successive refinement. • Successive refinement “up and to the right”

  31. Recognition System Design Collaborators Washington University Joseph A. O’Sullivan Andrew Li Naveen Singla Po-Hsiang Lai Lee Montagnino Brandon Westover Robert Pless Ronald S. Indeck Natalia A. Schmid (WVU) Michael D. DeVore (UVa) Alan Van Nevel • Developing robust ATR algorithms, deriving limits on recognition performance • Quantifying recognition performance as a function of system resource measures • Developing algorithms and implementations that adapt to dynamically varying resource constraints • Time, availability of processors, communication bandwidth, data storage, sensor image quality • Impact: increase efficiency and effectiveness of system implementations • Information latency problem • Recognition systems using visual imagery, SAR, ladar • Increase in ATR performance • Allow more imagery to be screened • Provide systematic tools for analyzing design choices such as processors and network communication

  32. Selected Limitations of Existing Systems • “Stovepipe” design • Fixed inputs, processing, database, output • Fixed time • Algorithms are not transparent → Seek “any-time” adaptive system design

  33. Naval Capability Provided “Network centric warfare is military operations that exploit information and networking technology to integrate widely dispersed human decision makers, situational and targeting sensors, and forces and weapons into a highly adaptive, comprehensive system to achieve unprecedented mission effectiveness.” Network-Centric Naval Forces, Naval Studies Board, National Research Council, 2000 • Active Computations • Exploit technology • Integrate sensors, resource allocation, decision makers, algorithms • Adapt to dynamically varying resources • Provide measures of uncertainty as a function of available resources