Globally Optimal Estimates for Geometric Reconstruction Problems - PowerPoint PPT Presentation

    Presentation Transcript
    1. Globally Optimal Estimates for Geometric Reconstruction Problems Tom Gilat, Adi Lakritz Advanced Topics in Computer Vision Seminar Faculty of Mathematics and Computer Science Weizmann Institute 3 June 2007

    2. Outline
    • Motivation and Introduction
    • Background
      • Positive SemiDefinite matrices (PSD)
      • Linear Matrix Inequalities (LMI)
      • SemiDefinite Programming (SDP)
    • Relaxations
      • Sum Of Squares (SOS) relaxation
      • Linear Matrix Inequalities (LMI) relaxation
    • Application in vision
      • Finding optimal structure
      • Partial relaxation and Schur's complement

    3. Motivation: geometric reconstruction problems can be formulated as polynomial optimization problems (POPs).

    4. Triangulation problem in the L2 norm: with 2 views there is an exact solution; the multi-view case requires optimization.

    5. Triangulation problem in the L2 norm (figure: a 3D point in x-y-z space observed by perspective camera i, with its reprojection error in the image).

    6. Triangulation problem in the L2 norm: minimize the reprojection error over all cameras. This is a polynomial minimization problem, and it is non-convex.
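The L2 triangulation objective from the slide above can be sketched numerically. This is a minimal illustration, not code from the talk; the cameras, the point, and the helper name `reprojection_error` are all hypothetical examples:

```python
import numpy as np

def reprojection_error(X, cameras, observations):
    """Sum of squared reprojection errors of a 3D point X over all cameras.

    cameras: list of 3x4 projection matrices P_i
    observations: list of observed 2D image points u_i
    """
    X_h = np.append(X, 1.0)            # homogeneous coordinates
    total = 0.0
    for P, u in zip(cameras, observations):
        x = P @ X_h                    # projected homogeneous image point
        proj = x[:2] / x[2]            # perspective division (source of non-convexity)
        total += np.sum((proj - u) ** 2)
    return total

# Hypothetical setup: two cameras observing the point (1, 2, 5) exactly.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])             # P1 = [I | 0]
P2 = np.hstack([np.eye(3), np.array([[1.0], [0], [0]])])  # P2 = [I | t]
X_true = np.array([1.0, 2.0, 5.0])
obs = [(P @ np.append(X_true, 1))[:2] / (P @ np.append(X_true, 1))[2]
       for P in (P1, P2)]
print(reprojection_error(X_true, [P1, P2], obs))  # 0 at the true point
```

Because of the perspective division, the objective is a sum of squared rational functions; clearing denominators yields the polynomial minimization problem the slides refer to.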

    7. More computer vision problems • Calculating a homography: given 3D points on a plane and corresponding image points, compute the homography • Reconstruction: with known cameras and known corresponding points, find the 3D points that minimize the reprojection error of the given image points (similar to triangulation, with many points and cameras) • Many more problems

    8. Optimization problems

    9. Introduction to optimization problems

    10. Optimization problems are, in general, NP-hard.

    11. Optimization problems: convex vs. non-convex. For convex problems, such as Linear Programming (LP) and SemiDefinite Programming (SDP), solutions exist via interior point methods; for non-convex problems the difficulties are local optima or high computational cost.

    12. Non-convex optimization: the feasible set is non-convex, and many algorithms get stuck in local minima (figure: level curves of f with a local max and min, and an initialization point).
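The "stuck in a local minimum" behavior on this slide is easy to reproduce. A minimal 1-D sketch (the function and step sizes are illustrative choices, not from the talk): plain gradient descent on a quartic with two minima converges to whichever basin the initialization lies in.

```python
# f(x) = x^4 - 3x^2 + x has a local minimum near x = 1.13
# and a global minimum near x = -1.30.
def f(x):
    return x**4 - 3 * x**2 + x

def grad_descent(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x -= lr * (4 * x**3 - 6 * x + 1)   # f'(x)
    return x

x_local = grad_descent(1.0)    # initialized in the basin of the local minimum
x_global = grad_descent(-1.0)  # initialized in the basin of the global minimum
print(x_local, x_global)
print(f(f(x_global) < f(x_local)))  # the two runs find different minima
```

This is exactly the failure mode that motivates the convex relaxations in the rest of the talk.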

    13. Optimization problems (cont.): convex problems (LP, SDP) are solved by interior point methods; for non-convex problems, which suffer from local optima or high computational cost, one applies a relaxation of the problem. This yields global optimization: algorithms that converge to the optimal solution.

    14. Outline
    • Motivation and Introduction
    • Background
      • Positive SemiDefinite matrices (PSD)
      • Linear Matrix Inequalities (LMI)
      • SemiDefinite Programming (SDP)
    • Relaxations
      • Sum Of Squares (SOS) relaxation
      • Linear Matrix Inequalities (LMI) relaxation
    • Application in vision
      • Finding optimal structure
      • Partial relaxation and Schur's complement

    15. Positive semidefinite (PSD) matrices. Definition: a matrix M in Rn×n is PSD if 1. M is symmetric: M = MT and 2. xTMx ≥ 0 for all x in Rn. Such an M can be decomposed as M = AAT (Cholesky). Proof of sufficiency: if M = AAT then xTMx = (ATx)T(ATx) = ‖ATx‖² ≥ 0.
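The AAT (Cholesky) characterization on this slide can be checked numerically. A minimal numpy sketch; the matrix here is an arbitrary example, not one from the talk:

```python
import numpy as np

# Build a PSD matrix as A A^T, per the decomposition on the slide.
A = np.array([[1.0, 0.0],
              [2.0, 1.0]])
M = A @ A.T

# Symmetry and nonnegative eigenvalues confirm M is PSD.
assert np.allclose(M, M.T)
print(np.linalg.eigvalsh(M))       # both eigenvalues >= 0

# np.linalg.cholesky recovers a factor L with L L^T = M
# (it requires strict positive definiteness, which holds here).
L = np.linalg.cholesky(M)
print(np.allclose(L @ L.T, M))     # True
```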


    17. Principal minors. The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n - k rows and the corresponding n - k columns of M. First order: the elements on the diagonal; second order: determinants of 2×2 principal submatrices.


    20. Principal minors (cont.): first order: the elements on the diagonal; second order: determinants of 2×2 principal submatrices; third order: det(M) (for a 3×3 matrix, the only third-order principal minor).
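The principal minors defined above give a PSD test (M is PSD iff every principal minor, of every order, is nonnegative). A small sketch under that definition; the test matrix is an arbitrary example:

```python
import itertools
import numpy as np

def principal_minors(M, k):
    """All k-th order principal minors: determinants of the k x k submatrices
    obtained by keeping the same k rows and columns of M."""
    n = M.shape[0]
    return [np.linalg.det(M[np.ix_(idx, idx)])
            for idx in itertools.combinations(range(n), k)]

M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# Collect minors of all orders and compare with the eigenvalue test.
minors = [m for k in (1, 2, 3) for m in principal_minors(M, k)]
print(all(m >= -1e-12 for m in minors))      # True: all principal minors >= 0
print(np.all(np.linalg.eigvalsh(M) >= 0))    # True: agrees with eigenvalue test
```

Note that nonnegativity of only the leading principal minors is not enough for PSD; all principal minors are needed, which is why the slides enumerate every order.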

    21. Set of PSD matrices in 2D

    22. Set of PSD matrices: this set is convex. Proof: for PSD matrices A, B and λ in [0,1], xT(λA + (1 - λ)B)x = λxTAx + (1 - λ)xTBx ≥ 0 for all x.
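The convexity argument above can be spot-checked numerically. A minimal sketch with randomly generated PSD matrices (the construction A A^T guarantees PSD-ness):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    A = rng.standard_normal((n, n))
    return A @ A.T               # A A^T is always PSD

A, B = random_psd(3), random_psd(3)

# Every convex combination of two PSD matrices stays PSD,
# since x^T(lam*A + (1-lam)*B)x = lam*x^T A x + (1-lam)*x^T B x >= 0.
for lam in np.linspace(0, 1, 11):
    C = lam * A + (1 - lam) * B
    assert np.all(np.linalg.eigvalsh(C) >= -1e-9)
print("all convex combinations PSD")
```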

    23. LMI – linear matrix inequality: a constraint of the form F(x) = F0 + x1F1 + … + xnFn ⪰ 0, where the Fi are symmetric matrices and x is in Rn.

    24. LMI example: find the feasible set of the 2D LMI


    26. LMI example: find the feasible set of the 2D LMI 1st order principal minors

    27. LMI example: find the feasible set of the 2D LMI 2nd order principal minors

    28. LMI example: find the feasible set of the 2D LMI 3rd order principal minors Intersection of all inequalities
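The feasible set of a 2D LMI, as in the example above, can be tested pointwise. The slide's own matrix is not recoverable from the transcript, so this sketch uses a hypothetical LMI F(x, y) = [[1, x], [x, y]] ⪰ 0, whose feasible set is exactly {y ≥ x²} by the principal-minor conditions:

```python
import numpy as np

def feasible(x, y):
    """Check F(x, y) = [[1, x], [x, y]] >= 0 via eigenvalues."""
    F = np.array([[1.0, x],
                  [x,   y]])
    return bool(np.all(np.linalg.eigvalsh(F) >= -1e-12))

# The eigenvalue test agrees with intersecting the minor inequalities
# (1 >= 0, y >= 0, y - x^2 >= 0), i.e. with y >= x^2.
for x, y in [(0.0, 0.0), (1.0, 2.0), (1.0, 0.5), (-2.0, 4.0)]:
    print((x, y), feasible(x, y), y >= x**2)
```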

    29. Semidefinite Programming (SDP): minimize a linear objective subject to an LMI constraint; an extension of LP.

    30. Outline
    • Motivation and Introduction
    • Background
      • Positive SemiDefinite matrices (PSD)
      • Linear Matrix Inequalities (LMI)
      • SemiDefinite Programming (SDP)
    • Relaxations
      • Sum Of Squares (SOS) relaxation
      • Linear Matrix Inequalities (LMI) relaxation
    • Application in vision
      • Finding optimal structure
      • Partial relaxation and Schur's complement

    31. Sum Of Squares relaxation (SOS): an unconstrained polynomial optimization problem (POP) is one whose feasible set is Rn. H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.

    32. Sum Of Squares relaxation (SOS): a polynomial p is a sum of squares if p = Σi qi² for some polynomials qi; every SOS polynomial is nonnegative.

    33. SOS relaxation for unconstrained polynomials


    35. monomial basis example

    36. SOS relaxation to SDP: writing p(x) = z(x)TQz(x) over a monomial basis z(x), with Q ⪰ 0, turns the SOS condition into an SDP feasibility problem.

    37. SOS relaxation to SDP example: SDP
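The slide's worked example is lost in the transcript, so here is a minimal substitute SOS certificate, checked numerically: p(x) = x⁴ + 2x² + 1 = (x² + 1)² is SOS, witnessed by a PSD Gram matrix Q over the monomial basis z(x) = (1, x, x²).

```python
import numpy as np

# p(x) = x^4 + 2x^2 + 1 = (x^2 + 1)^2, so with coefficient vector c = (1, 0, 1)
# for (1 + 0*x + x^2), the Gram matrix Q = c c^T is PSD and certifies SOS-ness.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])
print(np.linalg.eigvalsh(Q))       # all eigenvalues >= 0: Q is PSD

def p(x):
    return x**4 + 2 * x**2 + 1

# z(x)^T Q z(x) reproduces p(x) at every sample point.
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    z = np.array([1.0, x, x**2])
    assert np.isclose(z @ Q @ z, p(x))
print("SOS certificate verified")
```

Finding such a Q in general (rather than verifying a known one) is exactly the SDP feasibility problem the slides describe.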

    38. SOS for constrained POPs: this method can be extended to constrained POPs by using the generalized Lagrangian dual.

    39. SOS relaxation summary: POP → SOS problem → (SOS relaxation) → SDP → global estimate. So we know how to solve a POP that is an SOS, and we have a bound on a POP that is not an SOS. H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.

    40. Relaxations. SOS: POP → SOS problem → (SOS relaxation) → SDP → global estimate. LMI: POP → linear & LMI problem → (LMI relaxation) → SDP → global estimate, with guaranteed convergence.

    41. Outline
    • Motivation and Introduction
    • Background
      • Positive SemiDefinite matrices (PSD)
      • Linear Matrix Inequalities (LMI)
      • SemiDefinite Programming (SDP)
    • Relaxations
      • Sum Of Squares (SOS) relaxation
      • Linear Matrix Inequalities (LMI) relaxation
    • Application in vision
      • Finding optimal structure
      • Partial relaxation and Schur's complement

    42. LMI relaxations: constraints are handled; convergence to the optimum is guaranteed; applies to all polynomials, not only SOS.

    43. A maximization problem. Note that: • the feasible set is non-convex • the constraints are quadratic

    44. LMI – linear matrix inequality, a reminder: F(x) = F0 + x1F1 + … + xnFn ⪰ 0, with symmetric Fi.

    45. Motivation. Goal: an SDP whose solution is close to the global optimum of the original Polynomial Optimization Problem. What is it good for? SDP problems can be solved much more efficiently than general optimization problems.

    46. LMI relaxation is an iterative process. Step 1: introduce new variables (lifting). Step 2: relax constraints: POP → linear + LMI + rank constraints → SDP. Higher order relaxations can then be applied.

    47. LMI relaxation – step 1 (the R2 case): replace monomials by "lifting variables". Rule: each monomial x1^a x2^b is replaced by a new variable yab. Example: x1^2 → y20, x1x2 → y11.

    48. Introducing lifting variables (lifting).

    49. The new problem is linear, in particular convex, but it is not equivalent to the original problem: the lifting variables are not independent in the original problem.

    50. Goal, more specifically: linear problem (obtained by lifting) + "relations constraints" on the lifting variables → relaxation → SDP.