
Motion estimation from image and inertial measurements


Presentation Transcript


  1. Motion estimation from image and inertial measurements Dennis Strelow and Sanjiv Singh

  2. On the web Related materials: • these slides • related papers • movies • VRML models at: http://www.cs.cmu.edu/~dstrelow/northrop

  3. Introduction (1) micro air vehicle (MAV) navigation (AeroVironment Black Widow, AeroVironment Microbat)

  4. Introduction (2) Mars rover navigation (Mars Exploration Rovers (MER), Hyperion)

  5. Introduction (3) robotic search and rescue (Center for Robot-Assisted Search and Rescue, U. of South Florida; RHex)

  6. Introduction (4) NASA ISS Personal Satellite Assistant

  7. Introduction (5) Each of these problems requires: • 6 DOF motion estimation • in unknown environments • without GPS or other absolute positioning • over the long term …and some of the problems require: • small, light, and cheap sensors

  8. Introduction (6) Monocular, image-based motion estimation is a good candidate. In particular, simultaneous estimation of: • multiframe motion • sparse scene structure is the most promising approach

  9. Outline Image-based motion estimation Improving estimation Improving feature tracking Reacquisition

  10. Outline Image-based motion estimation (refresher, difficulties) Improving estimation Improving feature tracking Reacquisition

  11. Image-based motion estimation: refresher (1) A two-step process is typical… First, sparse feature tracking: • Inputs: raw images • Outputs: projections

  12. Image-based motion estimation: refresher (2)

  13. Image-based motion estimation: refresher (3) Second, estimation: • Inputs: projections from the tracker • Outputs: 6 DOF camera position at the time of each image, 3D position of each tracked point

  14. Image-based motion estimation: refresher (4)

  15. Image-based motion estimation: refresher (5) Algorithms exist. For tracking: • Lucas-Kanade (Lucas and Kanade, 1981)
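A minimal sketch of this tracking step, using OpenCV's pyramidal Lucas-Kanade implementation; the feature count, window size, and file handling are illustrative assumptions, not values from the talk.

# Sketch: sparse feature tracking with pyramidal Lucas-Kanade (OpenCV).
# Inputs: raw images; outputs: feature projections per frame.
import cv2

def track_sequence(image_paths):
    prev = cv2.imread(image_paths[0], cv2.IMREAD_GRAYSCALE)
    # Select corner-like features to track (Shi-Tomasi "good features to track").
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=10)
    tracks = [pts.reshape(-1, 2)]
    for path in image_paths[1:]:
        cur = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        # Pyramidal Lucas-Kanade: iteratively minimize the intensity matching
        # error with respect to each feature's position in the new image.
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, cur, pts, None,
                                                  winSize=(15, 15), maxLevel=3)
        good = status.ravel() == 1            # drop features that were lost
        pts = nxt[good].reshape(-1, 1, 2)
        tracks.append(pts.reshape(-1, 2))     # projections for the estimation step
        prev = cur
    return tracks   # a full tracker would also keep per-feature IDs across frames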

  16. Image-based motion estimation: refresher (6) For estimation: • SVD-based factorization (Tomasi and Kanade, 1992) • bundle adjustment (various, 1950’s) • Kalman filtering (Broida and Chellappa, 1990) • variable state dimension filter (McLauchlan, 1996)
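For the bundle adjustment variant, a minimal sketch that refines camera poses and 3D points by minimizing reprojection error with scipy.optimize.least_squares; the axis-angle pose parameterization and the known focal length f and principal point c are assumptions for the example, not the formulation used in the talk.

# Sketch: bundle adjustment as nonlinear least squares on reprojection error.
import numpy as np
from scipy.optimize import least_squares

def rotate(points, rvecs):
    # Rodrigues rotation of each point by the matching axis-angle vector.
    theta = np.linalg.norm(rvecs, axis=1, keepdims=True)
    with np.errstate(invalid='ignore', divide='ignore'):
        v = np.nan_to_num(rvecs / theta)
    dot = np.sum(points * v, axis=1, keepdims=True)
    cos, sin = np.cos(theta), np.sin(theta)
    return cos * points + sin * np.cross(v, points) + dot * (1 - cos) * v

def project(points, rvecs, tvecs, f, c):
    # Pinhole projection of world points into each camera.
    p = rotate(points, rvecs) + tvecs
    return f * p[:, :2] / p[:, 2:] + c

def residuals(params, n_cams, n_pts, cam_idx, pt_idx, observed, f, c):
    # Unpack camera rotations, translations, and point positions from one vector.
    rvecs = params[:n_cams * 3].reshape(n_cams, 3)
    tvecs = params[n_cams * 3:n_cams * 6].reshape(n_cams, 3)
    points = params[n_cams * 6:].reshape(n_pts, 3)
    pred = project(points[pt_idx], rvecs[cam_idx], tvecs[cam_idx], f, c)
    return (pred - observed).ravel()

# Usage: pack initial guesses and refine.
# x0 = np.hstack([rvecs0.ravel(), tvecs0.ravel(), points0.ravel()])
# sol = least_squares(residuals, x0, args=(n_cams, n_pts, cam_idx, pt_idx, obs, f, c))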

  17. Image-based motion estimation: difficulties (1) So, the problem is solved?

  18. Image-based motion estimation: difficulties (2) • If so, where are the automatic systems for estimating the motion of these platforms, from images, in unknown environments?

  19. Image-based motion estimation: difficulties (3) …and for automatically modeling • rooms • buildings • cities from a handheld camera?

  20. Image-based motion estimation: difficulties (4) Estimation step can be very sensitive to: • incorrect or insufficient image feature tracking • camera modeling and calibration errors • outlier detection thresholds • sequences with degenerate camera motions

  21. Image-based motion estimation: difficulties (5) …and for recursive methods in particular: • poor prior assumptions on the motion • poor approximations in state error modeling

  22. Image-based motion estimation: difficulties (6) • 151 images, 23 points

  23. Image-based motion estimation: difficulties (7)

  24. Outline Image-based motion estimation Improving estimation (overview, image and inertial measurements) Improving feature tracking Reacquisition

  25. Improving estimation: overview

  26. Improving estimation: overview

  27. Improving estimation: image and inertial (1) Image and inertial measurements are highly complementary Inertial measurements can: • resolve the ambiguities in image-only estimates • establish the global scale

  28. Improving estimation: image and inertial (2) Image measurements can: • reduce the drift in integrating inertial measurements • distinguish between rotation, gravity, acceleration, bias, and noise in accelerometer readings
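To make the drift concrete, a small sketch of dead reckoning from accelerometer readings alone; the sample rate, bias, and orientation inputs are illustrative. Errors in bias or orientation are integrated twice, so position error grows rapidly unless image measurements bound it.

# Sketch: integrating accelerometer readings alone (dead reckoning).
import numpy as np

def integrate_accel(accels, dt, R_world_from_body, bias=np.zeros(3),
                    g=np.array([0.0, 0.0, -9.81])):
    """accels: (N, 3) body-frame accelerometer samples (specific force).
    R_world_from_body: (N, 3, 3) orientations at each sample (from gyros).
    Returns world-frame positions; bias and orientation errors integrate twice."""
    v = np.zeros(3)
    p = np.zeros(3)
    positions = [p.copy()]
    for a_body, R in zip(accels, R_world_from_body):
        # Rotate the specific force into the world frame, subtract the bias,
        # and add gravity back before integrating velocity and position.
        a_world = R @ (a_body - bias) + g
        v = v + a_world * dt
        p = p + v * dt
        positions.append(p.copy())
    return np.array(positions)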

  29. Improving estimation: image and inertial (3)

  30. Improving estimation: image and inertial (4)

  31. Improving estimation: image and inertial (5) • Other examples: • global scale typically within 5% • better convergence than image-only estimation

  32. Improving estimation: image and inertial (6) Many more details in: Dennis Strelow and Sanjiv Singh. Motion estimation from image and inertial measurements. International Journal of Robotics Research, to appear.

  33. Outline Image-based motion estimation Improving estimation Improving feature tracking (Lucas-Kanade, Lucas-Kanade and real sequences, the “smalls” tracker) Reacquisition

  34. Improving feature tracking: Lucas-Kanade (1) • Lucas-Kanade has been the go-to feature tracker for shape-from-motion • iteratively minimize the intensity matching error… • …with respect to the feature’s position in the new image
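A minimal sketch of that inner loop: translation-only Lucas-Kanade for a single feature by Gauss-Newton, assuming grayscale float images as NumPy arrays; the window size and iteration limits are illustrative assumptions.

# Sketch: one feature, translation-only Lucas-Kanade by Gauss-Newton.
# Minimizes sum over the window of [I1(p + d) - I0(p)]^2 over the displacement d.
import numpy as np
from scipy.ndimage import map_coordinates

def lucas_kanade_point(I0, I1, x, y, half=7, iters=20):
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    template = map_coordinates(I0, [ys + y, xs + x], order=1)
    d = np.zeros(2)   # displacement (dx, dy)
    for _ in range(iters):
        warped = map_coordinates(I1, [ys + y + d[1], xs + x + d[0]], order=1)
        error = warped - template
        # Image gradients of the warped window give the Jacobian wrt (dx, dy).
        gy, gx = np.gradient(warped)
        J = np.stack([gx.ravel(), gy.ravel()], axis=1)
        H = J.T @ J                       # Gauss-Newton normal equations
        step = np.linalg.solve(H, -J.T @ error.ravel())
        d += step
        if np.linalg.norm(step) < 1e-3:   # converged to subpixel resolution
            break
    return x + d[0], y + d[1]             # feature position in the new image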

  35. Improving feature tracking: Lucas-Kanade (2) Additional heuristics are used to apply Lucas-Kanade to shape-from-motion.

  36. Improving feature tracking: Lucas-Kanade (3) • Lucas-Kanade advantages: • fast • subpixel resolution • can handle some large motions well • uses general minimization, so easily extendible

  37. Improving feature tracking: Lucas-Kanade (4) 0.1 pixel average reprojection error!

  38. Improving feature tracking: Lucas-Kanade and real sequences (1) But Lucas-Kanade performs poorly on many real sequences…

  39. Improving feature tracking: Lucas-Kanade and real sequences (2) …and image-based motion estimation can be sensitive to errors in feature tracking

  40. Improving feature tracking: Lucas-Kanade and real sequences (3)

  41. Improving feature tracking: Lucas-Kanade and real sequences (4)

  42. Improving feature tracking: Lucas-Kanade and real sequences (5)

  43. Improving feature tracking: Lucas-Kanade and real sequences (6) • Why does Lucas-Kanade perform poorly on many real sequences? • the heuristics are poor • the features are tracked independently

  44. Improving feature tracking: the “smalls” tracker (1) • smalls is a new feature tracker for shape-from-motion and similar applications • eliminates the heuristics normally used with Lucas-Kanade • enforces the rigid scene constraint

  45. Improving feature tracking: the “smalls” tracker (2) Leonard Smalls; tracker, manhunter

  46. Improving feature tracking: the “smalls” tracker (3) Pipeline (figure): epipolar geometry → 1-D matching along epipolar lines → geometric mistracking detection → feature death and birth → features to 6 DOF estimation output

  47. Improving feature tracking: the “smalls” tracker (3) Pipeline (figure): SIFT epipolar geometry features → 1-D matching along epipolar lines → geometric mistracking detection → feature death and birth → features to 6 DOF estimation output

  48. Improving feature tracking: the “smalls” tracker (4) • SIFT keypoints (Lowe, IJCV 2004): • image interest points • can be extracted despite large changes in viewpoint • to subpixel accuracy • A keypoint’s feature vectors in two images usually match
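A small sketch of SIFT extraction and matching with OpenCV; the 0.8 ratio-test threshold follows Lowe’s paper, and the remaining parameters are library defaults rather than values from the talk.

# Sketch: SIFT keypoint extraction and matching between adjacent frames (OpenCV).
import cv2

def sift_matches(img0, img1, ratio=0.8):
    sift = cv2.SIFT_create()
    kp0, desc0 = sift.detectAndCompute(img0, None)
    kp1, desc1 = sift.detectAndCompute(img1, None)
    # Match feature vectors; keep pairs that pass Lowe's ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(desc0, desc1, k=2)
    good = [m for m, n in raw if m.distance < ratio * n.distance]
    pts0 = [kp0[m.queryIdx].pt for m in good]
    pts1 = [kp1[m.trainIdx].pt for m in good]
    return pts0, pts1   # subpixel keypoint locations in each image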

  49. Improving feature tracking: the “smalls” tracker (5) Epipolar geometry between adjacent images is determined using: • SIFT extraction and matching • two-frame bundle adjustment • RANSAC
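The talk determines this geometry with SIFT matching, two-frame bundle adjustment, and RANSAC; as a rough stand-in, here is a sketch using OpenCV’s RANSAC-based five-point essential matrix estimator on the same matches, assuming a calibrated camera with intrinsic matrix K.

# Sketch: epipolar geometry between adjacent images from SIFT matches + RANSAC.
import cv2
import numpy as np

def epipolar_geometry(pts0, pts1, K):
    pts0 = np.asarray(pts0, dtype=np.float64)
    pts1 = np.asarray(pts1, dtype=np.float64)
    # Five-point algorithm inside RANSAC rejects mismatched keypoints.
    E, inliers = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Fundamental matrix, used later to compute epipolar lines for the 1-D search.
    F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)
    return E, F, inliers.ravel().astype(bool)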

  50. Improving feature tracking: the “smalls” tracker (6) 1-D matching along epipolar lines • Search for new feature locations constrained to epipolar lines: • initial position from nearby SIFT matches • discrete SSD search (e.g., ±60 pixels) • 1-D Lucas-Kanade refines the match
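A minimal sketch of the discrete SSD search, assuming the epipolar line is given in implicit form and the initial guess comes from nearby SIFT matches; the window size is an assumption, and the winning candidate would then be refined by 1-D Lucas-Kanade along the line.

# Sketch: discrete SSD search for a feature along its epipolar line.
import numpy as np
from scipy.ndimage import map_coordinates

def ssd_search_epipolar(I1, template, line, x0, y0, search=60, step=1.0):
    """line = (a, b, c) with a*x + b*y + c = 0; (x0, y0) is the initial guess,
    e.g., interpolated from nearby SIFT matches. template is the feature window."""
    a, b, c = line
    d = np.array([-b, a]) / np.hypot(a, b)      # unit direction along the line
    half = template.shape[0] // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    best, best_t = np.inf, 0.0
    for t in np.arange(-search, search + step, step):
        cx, cy = x0 + t * d[0], y0 + t * d[1]
        patch = map_coordinates(I1, [ys + cy, xs + cx], order=1)
        ssd = np.sum((patch - template) ** 2)   # intensity matching error
        if ssd < best:
            best, best_t = ssd, t
    # Best discrete match; a 1-D Lucas-Kanade step would refine this to subpixel.
    return x0 + best_t * d[0], y0 + best_t * d[1]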
