
Presentation Transcript


  1. Robust Identification of Hybrid Systems: Compressive Information Extraction. M. Sznaier, O. Camps, Robust Systems Lab, Dept. of Electrical and Computer Eng., Northeastern University. C. Lagoa, Dept. of Electrical Eng., Penn State University.

  2. What do these have in common? • Human tracking and activity analysis • Tumor detection in low-contrast images • Detecting gene activity in a diauxic shift. In all cases, the relevant events are comparatively rare and encoded in 1/100 to less than 1/10^6 of the data.

  3. What do these have in common? • Human tracking and activity analysis • Tumor detection in low-contrast images • Detecting gene activity in a diauxic shift. Claim: a hidden hybrid systems identification problem.

  4. Where should we pay attention?: Features (edges, regions, etc.) are important.

  5. Where should we pay attention?: Dynamics are important too!

  6. Compressive Sensing: • Strong prior: the signal has a sparse representation (only a few ci ≠ 0). • Signal recovery: “sparsify” the coefficients; relax the ℓ0 problem to an LP.
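The LP relaxation on this slide can be sketched numerically. The following is a minimal basis-pursuit example (all dimensions, seeds, and variable names are illustrative, not from the talk): the ℓ0 objective is relaxed to ℓ1, which becomes a linear program after splitting x into positive and negative parts.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit sketch: recover a sparse signal x from y = A x by relaxing
# min ||x||_0 to the linear program min ||x||_1 s.t. A x = y.
# Split x = u - v with u, v >= 0 so the objective is linear.
rng = np.random.default_rng(0)
n, m = 20, 12                       # ambient dimension, number of measurements
A = rng.standard_normal((m, n))     # random measurement matrix

x_true = np.zeros(n)                # sparse ground truth: only two c_i != 0
x_true[3], x_true[11] = 1.5, -2.0
y = A @ x_true

c = np.ones(2 * n)                  # minimize sum(u) + sum(v) = ||x||_1
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y)   # default bounds: u, v >= 0
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true))   # typically near zero: exact recovery regime
```

With far fewer measurements than unknowns (12 vs. 20), the ℓ1 relaxation still recovers the two-sparse signal, which is the phenomenon the slide appeals to.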

  7. Compressive Sensing vs. Compressive Information Extraction: • Compressive sensing. Strong prior: the signal has a sparse representation (only a few ci ≠ 0). Signal recovery: “sparsify” the coefficients; relax to an LP. • Compressive information extraction. Strong prior: actionable information is generated by low-complexity dynamical systems. Information extraction: “sparsify” the dynamics; relax to an SDP.

  8. u y G() Information extraction as an Id problem: features, pixel values, … • Model data streams as outputs of piecewise LTI systems • “Interesting” events  Model invariant(s) changes • “Homogeneous” segments  output of a single LTI sub-system

  9. Piecewise Affine (PWA) Systems Id problem: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y). • Find: a piecewise affine model such that the model output matches the data within the noise bound.

  10. Piecewise Affine (PWA) Systems Id problem: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y). • Find: a piecewise affine model such that the model output matches the data within the noise bound. Ill-posed: it always has a trivial solution (switching at every time instant can fit any data).

  11. Piecewise Affine (PWA) Systems Id problem: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y). • Find: a piecewise affine model, with the minimum number of sub-system switches, such that the model output matches the data within the noise bound.

  12. PWAS Id problem with min # switches: • Main idea: a non-zero g(t) indicates a SWITCH.

  13. PWAS Id problem with min # switches: • Main idea: minimizing the number of switches ⇔ minimizing ||g||0: a sparsification problem.

  14. PWAS Id problem with min # switches: • Formally:

  15. PWAS Id problem with min # switches: • Formally: FACT: the relaxation recovers the “exact” solution (switch times tk, tk+1).

  16. Example: Video segmentation

  17. PWAS Id problem with fixed # subsystems: medical image segmentation, activity analysis. Here we need to tell when we are back to the original system.

  18. PWAS Id problem with fixed # subsystems: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y); the number of sub-models. • Find: a piecewise affine model consistent with this a priori information. NP-hard; solvable as a MILP (Bemporad et al.).

  19. PWAS Id problem with fixed # subsystems: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y); the number of sub-models. • Find: a piecewise affine model consistent with this a priori information. Reduces to a rank minimization problem.

  20. PWAS Id problem with fixed # subsystems: • Given: bounds on the noise (||η||∗ ≤ ε) and on the sub-system order (no); input/output data (u, y); the number of sub-models. • Find: a piecewise affine model consistent with this a priori information. Reduces to an SDP.

  21. PWAS Id problem in the noise free case: • GPCA: an algebraic geometric method due to Vidal et al. • Main idea: neither the mode signal nor the parameters b are known, but the hybrid decoupling polynomial (the product of the sub-model residuals, identically equal to 0) is independent of the mode signal and linear in the lifted parameters c.

  22. Toy example: two first-order systems:

  23. Toy example: two first-order systems:

  24. Toy example: two first-order systems: the decoupling equation factors as (a function of the data only) × (the system parameters, independent of the data) = 0, with one such equation per data point.
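The toy example on these slides can be reproduced numerically. This is a hedged sketch of the GPCA idea for two first-order systems (the parameter values and sample counts are illustrative): the hybrid decoupling polynomial gives one equation per data point, linear in the lifted coefficients c, which are recovered from the null space of the embedded data matrix.

```python
import numpy as np

# Two first-order systems y_t = a_i * y_{t-1}.  Regardless of mode, every sample
# satisfies (y_t - a1 y_{t-1}) (y_t - a2 y_{t-1}) = 0, i.e.
#   c0*y_t^2 + c1*y_t*y_{t-1} + c2*y_{t-1}^2 = 0  with  c = [1, -(a1+a2), a1*a2],
# which is independent of the mode signal and LINEAR in c.
rng = np.random.default_rng(3)
a = np.array([0.5, -0.8])                 # the two (unknown) sub-systems
y_prev = rng.standard_normal(100)
mode = rng.integers(0, 2, 100)            # unknown mode signal
y_next = a[mode] * y_prev

# Embedded data matrix (degree-2 Veronese map): one equation per data point.
V = np.column_stack([y_next**2, y_next * y_prev, y_prev**2])
c = np.linalg.svd(V)[2][-1]               # null-space vector of V
c /= c[0]                                 # normalize the leading coefficient

a_hat = np.sort(np.roots(c).real)         # factor the decoupling polynomial
print(a_hat)                              # recovers {-0.8, 0.5}
```

Factoring the recovered polynomial (here, simple root finding) returns the two sub-system parameters without ever knowing which mode generated which sample, which is the point of the construction.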

  25. PWAS Id problem in the noise free case: • GPCA: an algebraic geometric method due to Vidal et al. • Main idea: solve for c from the null space of the embedded data matrix; get the bi from c via polynomial differentiation. Details in Vidal et al., 2003.

  26. What happens with noisy measurements? • GPCA: an algebraic geometric method due to Vidal et al. • Main idea: solve for c from the null space of the embedded data matrix; get the bi from c via polynomial differentiation. • With measurement noise ηt, the embedded data matrix is perturbed.

  27. What happens with noisy measurements? • GPCA: an algebraic geometric method due to Vidal et al. • Main idea: solve for c from the null space of the embedded data matrix; get the bi from c via polynomial differentiation. • With noise ηt, we need to find the null space of a matrix that depends polynomially on the noise. Obvious approach: SVD.

  28. Academic Example Noise bound: 0.25

  29. What happens with noisy measurements? • GPCA: an algebraic geometric method due to Vidal et al. • Main idea: solve for c from the null space of the embedded data matrix; get the bi from c via polynomial differentiation. • With noise ηt, we need to find the null space of a matrix that depends polynomially on the noise. Approach: minimize the rank of Vs w.r.t. ηt.

  30. Detour: Polynomial Optimization (from Lasserre 01). Theorem: (P1) and (P2) are equivalent; that is:

  31. Detour: Polynomial Optimization (from Lasserre 01). Theorem: (P1) and (P2) are equivalent; that is:

  32. Detour: Polynomial Optimization (from Lasserre 01). The objective is affine in the moments mi.

  33. Detour: Polynomial Optimization (from Lasserre 01). The objective is affine in the moments mi; the Hausdorff/Hamburger moment problems characterize valid moment sequences via a set of LMIs.
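The (P1)/(P2) equations on these slides did not survive transcription. As a hedged reconstruction of the standard Lasserre formulation being referenced:

```latex
% (P1): polynomial optimization over a semialgebraic set K = {x : g(x) >= 0}
p^{*} \;=\; \min_{x \in K} \, p(x) \;=\; \min_{x \in K} \sum_{\alpha} p_{\alpha} x^{\alpha}

% (P2): the equivalent moment problem, over moment sequences m = (m_\alpha)
p^{*} \;=\; \min_{m} \, \sum_{\alpha} p_{\alpha} m_{\alpha}
\quad \text{s.t.} \quad M_{N}(m) \succeq 0, \;\; M_{N}(g\,m) \succeq 0, \;\; m_{0} = 1
```

Here M_N(m) is the moment matrix, which is affine in the moments mi (slide 32), and M_N(g m) are the localizing matrices encoding the constraints defining K; the positive-semidefiniteness conditions are exactly the set of LMIs mentioned on slide 33.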

  34. What happens with noisy measurements? Rank is not a polynomial function. Can we use ideas from polynomial optimization? Yes. Optimization Problem 1:

  35. What happens with noisy measurements? Rank is not a polynomial function. Can we use ideas from polynomial optimization? Yes. Optimization Problem 1: Optimization Problem 2:

  36. What happens with noisy measurements? Rank is not a polynomial function. Can we use ideas from polynomial optimization? Yes. Optimization Problem 1: Optimization Problem 2: a convex constraint set!

  37. What happens with noisy measurements? Rank is not a polynomial function. Can we use ideas from polynomial optimization? Yes. • Fact: there exists a rank-deficient solution for Problem 2 if and only if there exists a rank-deficient solution for Problem 1. • If c belongs to the nullspace of the solution of Problem 2, there exists a noise value, within the a priori bound, such that c belongs to the nullspace of the corresponding noise-corrected data matrix.

  38. What happens with noisy measurements? Rank is not a polynomial function. Can we use ideas from polynomial optimization? Yes. Algorithm: • Solve Problem 2 (matrix rank minimization subject to LMI constraints) using a convex relaxation, e.g. the log-det heuristic of Fazel et al. • Find a vector c in the nullspace. • Estimate the noise by root finding (Vs c = 0 gives polynomials of one variable). • Proceed as in the noise-free case.

  39. Example: Human Activity Analysis WALK BEND WALK

  40. Model (In)validation of SARX Systems • Given: a nominal hybrid model, a bound on the noise (||η||∞ ≤ ε), and experimental input/output data. • Determine: whether there exist noise and switching sequences consistent with the a priori information and the experimental data. Equivalent to checking emptiness of a semialgebraic set; reduces to an SDP via moments and duality.

  41. Semi-algebraic Consistency Set

  42. Semi-algebraic Consistency Set • One of the submodels is active at each time t (a logical OR), i.e., the product of the submodel residuals equals 0.

  43. Semi-algebraic Consistency Set • The model is invalid if and only if this set is empty. • It is possible to use the Positivstellensatz to get invalidation certificates; however, it is easier to exploit the problem structure via moment-based polynomial optimization: the model is invalid iff the optimal value is strictly positive.

  44. Polynomial Optimization • The objective decomposes as pna + pna+1 + … + pT. • The problem has a sparse structure (the running intersection property holds), so there is no need to consider all cross moments. Details in Lasserre 06.

  45. Polynomial Optimization • The problem has a sparse structure (the running intersection property holds). • A moments-based relaxation (convergent as N↑) yields a convex SDP: the standard relaxation needs O((T·ny)^(2N)) variables, while exploiting the structure needs only O((na·ny)^(2N)) variables.
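A quick back-of-the-envelope calculation makes the variable-count gap concrete (the numerical values of T, ny, na, and N below are hypothetical, chosen only for illustration):

```python
# Illustrative comparison of moment-relaxation sizes: the standard relaxation
# needs on the order of (T*ny)^(2N) moment variables, while exploiting the
# running intersection property needs only on the order of (na*ny)^(2N).
T, ny, na, N = 100, 2, 3, 2       # horizon, # outputs, sub-model order, relaxation order

standard = (T * ny) ** (2 * N)    # 200^4 variables
sparse = (na * ny) ** (2 * N)     # 6^4 variables
print(standard, sparse)           # 1600000000 vs. 1296
```

Even for this modest horizon, exploiting the sparse structure shrinks the relaxation by about six orders of magnitude, which is what makes the SDP tractable.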

  46. Example: Activity Monitoring • Set of “normal” activities: walking and waiting • Estimate center of mass with background subtraction • Identified model for walk: • Model for wait: Training sequence for WALK

  47. Example: Activity Monitoring • A priori hybrid model: walking and waiting, 4% noise. • Test sequences of hybrid behavior: WALK, WAIT → not invalidated; WALK, JUMP → invalidated; RUN → invalidated.

  48. Identifying Sparse Dynamical Networks Who is in the same team? Who reacts to whom?

  49. Identifying Sparse Dynamical Networks • Given time series data: • What causes what? (Granger causality) • Are there hidden inputs?

  50. Formalization as a graph id problem: • Given time series data: • What causes what? (Granger causality) • Are there hidden inputs?
