615 Project: Real-time monocular vision-based SLAM
Adam Rachmielowski
Overview
  • SFM and SLAM
  • Extended Kalman filter
  • Visual SLAM details
  • Results
  • Next
Estimating structure and motion
  • Factorization [Tomasi & Kanade ’92]
    • Batch method
    • Efficient
    • Originally for affine camera
    • Missing data?
    • Finite camera [Sturm & Triggs]

W = MX
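For reference, a minimal sketch (not from the slides) of the rank-3 factorization step: a centered 2F x N measurement matrix W is factored by a truncated SVD into affine motion M and structure X, recovering W ≈ M X up to an affine ambiguity. Function and variable names are illustrative.

```python
import numpy as np

def affine_factorization(W):
    """Factor a centered 2F x N measurement matrix W into motion M (2F x 3)
    and structure X (3 x N), following the Tomasi-Kanade rank-3 idea."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Keep the three dominant singular values (rank-3 approximation).
    U3, s3, Vt3 = U[:, :3], s[:3], Vt[:3, :]
    M = U3 * np.sqrt(s3)             # affine camera (motion) matrix
    X = np.sqrt(s3)[:, None] * Vt3   # 3D structure, up to an affine ambiguity
    return M, X
```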

Estimating structure and motion
  • Reconstruction from N views [Hartley & Zisserman ’00]
    • Multiview geometric entities and algorithms described by Faugeras, Hartley, Zisserman, and others
    • Minimize global error with bundle adjustment
    • Can be used sequentially
    • Upgrade to Euclidean with auto calibration

xFPX

SLAM
  • Simultaneous Localisation And Mapping
  • Estimate robot’s pose and map feature positions
  • Probabilistic framework maintains
    • current estimate
    • estimate uncertainty (covariance)
  • Update based on measurements and model
  • Many systems use
    • odometry and active sensors as measurement devices
    • limited motion models
Vision-based SLAM
  • Camera for measurements
  • Trinocular
    • 3D measurements by triangulation
    • Offline [Ayache, Faugeras ’89]
    • Real-time with SIFTs [Se, Lowe, Little ’01]
  • Real-time monocular [Chiuso et al. ’00]
Kalman filter [Swerling ’58][Welch, Bishop ’01]
  • Estimates state of dynamic system
  • Integrates noisy measurements to give optimal estimate
  • Noise is Gaussian
  • First order Markov process
KF: key variables
  • x_k: estimate of state at time k
  • P_k: error covariance (estimate uncertainty)
  • F: state transition function
  • z_k: measurement
  • H: state-to-measurement function
  • Q, R: noise covariances (process and measurement)
KF: Two-phase estimation
Predict
  • Predicted state: x_k|k-1 = F x_k-1|k-1
  • Predicted covariance: P_k|k-1 = F P_k-1|k-1 F^T + Q
KF: Two-phase estimation
Update
  • Innovation: y_k = z_k - H x_k|k-1
  • Innovation covariance: S_k = H P_k|k-1 H^T + R
  • Kalman gain: K_k = P_k|k-1 H^T S_k^-1
  • State: x_k|k = x_k|k-1 + K_k y_k
  • Covariance: P_k|k = (I - K_k H) P_k|k-1
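For concreteness, a minimal numpy sketch of the two phases under the standard linear-Gaussian assumptions (an illustration, not the project's code), using the F, H, Q, R notation from the slides above.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Predict step: propagate state and covariance through the model."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Update step: fold a measurement z into the prediction."""
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x_pred + K @ y
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```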
EKF: Extended Kalman filter
  • Allows non-linear functions (f, h in place of F, H)
  • Apply the functions themselves to the state
  • Apply their Jacobians to the covariances
  • Linearize the functions around the current estimate
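The non-linear case can be sketched the same way (again an illustration, not Davison's implementation): the functions f and h are applied to the state, while their Jacobians, evaluated at the current estimate, propagate the covariance.

```python
import numpy as np

def ekf_predict(x, P, f, F_jac, Q):
    """EKF predict: nonlinear model f acts on the state,
    its Jacobian (evaluated at x) on the covariance."""
    F = F_jac(x)
    return f(x), F @ P @ F.T + Q

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """EKF update: innovation uses h(x); covariance terms use its Jacobian."""
    H = H_jac(x_pred)
    y = z - h(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ y, (np.eye(len(x_pred)) - K @ H) @ P_pred
```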
Visual SLAM details [Davison ’03]
  • State representation x, P
  • Process model F (motion)
  • Measurement model H (projection)
  • State update
  • System initialization
  • Adding and removing features
State representation
  • Scene structure (feature points)
    • Depth from reference image [Azarbayejani, Pentland ’95]
    • x,y,z coordinates
  • Camera
    • Pose
    • Motion
State estimate vector
  • Points yi
  • Camera xv
    • 6DOF pose
    • Constant velocity motion model
    • Acceleration modeled as noise
Covariance matrix
  • Covariance blocks
    • P_xx: camera parameters
    • P_yiyi: point i
  • Off-diagonal blocks represent correlation between estimates
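To make the layout concrete, here is a hypothetical sketch of how the joint state vector [x_v, y_1, y_2, ...] and its block covariance could be indexed. The 13-parameter camera block (position, orientation quaternion, linear and angular velocity) is the layout used in Davison-style monocular SLAM; the helper names are invented here.

```python
import numpy as np

CAM_DIM = 13  # position (3) + quaternion (4) + velocity (3) + angular velocity (3)

def point_slice(i):
    """Index range of point y_i inside the joint state vector
    [x_v, y_1, y_2, ...]; each point is an (x, y, z) triple."""
    start = CAM_DIM + 3 * i
    return slice(start, start + 3)

def covariance_block(P, i, j):
    """Block P_{y_i y_j} of the full covariance: the correlation between the
    estimates of points i and j (the diagonal blocks are P_yiyi)."""
    return P[point_slice(i), point_slice(j)]
```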
Process model
  • Points don't move: y_k = y_k-1
  • Add velocity and acceleration to the current camera parameters
  • Covariance is updated using the Jacobian of the process function
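A hedged sketch of the constant-velocity prediction, assuming the camera layout from the previous sketch and omitting quaternion propagation for brevity; in the EKF, the covariance is then propagated through the Jacobian of this function, as the last bullet says.

```python
import numpy as np

def process_model(x, dt):
    """Constant-velocity prediction: feature points (everything past the
    camera block) satisfy y_k = y_{k-1}; the camera position advances by its
    estimated velocity. Assumed layout: [r(3), q(4), v(3), w(3), points...]."""
    x_new = x.copy()
    x_new[0:3] = x[0:3] + dt * x[7:10]   # r <- r + v * dt; points unchanged
    return x_new
```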
Measurement model
  • H models the projection of the predicted points by the predicted camera
  • Innovation covariance S_i guides the feature-match search
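As an illustration of the measurement model h, a minimal pinhole-projection sketch with assumed intrinsics (fx, fy, cx, cy) and an explicit world-to-camera rotation R_cw; this is not the exact parameterization used in the project.

```python
import numpy as np

def project_point(y_world, R_cw, t_cw, fx, fy, cx, cy):
    """Measurement model h: predicted pixel location (u, v) of the world
    point y_world as seen by the predicted camera pose (R_cw, t_cw)."""
    p = R_cw @ y_world + t_cw            # world frame -> camera frame
    return np.array([fx * p[0] / p[2] + cx,
                     fy * p[1] / p[2] + cy])
```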
Making measurements / Update
  • Project innovation covariance to search ellipse
  • Warp template based on camera and point prediction
  • If viewing angle is good, match to get measurement
  • Compute Kalman gain and update state and covariance
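The search-ellipse step can be sketched as follows (the names and the 3-sigma gate are assumptions, not taken from the slides): the eigen-decomposition of the 2x2 innovation covariance S_i gives the axes of the image region in which the template match is searched.

```python
import numpy as np

def search_ellipse(S_i, n_sigma=3.0):
    """Axis directions and semi-axis lengths (in pixels) of the search region
    implied by a 2x2 innovation covariance S_i."""
    eigvals, eigvecs = np.linalg.eigh(S_i)   # S_i is symmetric positive-definite
    semi_axes = n_sigma * np.sqrt(eigvals)   # n-sigma bound along each axis
    return eigvecs, semi_axes
```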
System initialization
  • Need initial estimate and covariance
    • Calibration object
    • SFM
  • Process covariance
    • Small: small searches, but can only handle small accelerations
    • Large: can handle big accelerations, but need many measurements
  • Measurement covariance
    • Function of matching method (camera resolution)
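As a rough illustration of these trade-offs (a simplified assumption on my part, not the scheme used in the project), the process and measurement covariances could be parameterized by an assumed acceleration level and pixel matching error; a larger sigma_accel tolerates bigger accelerations at the cost of larger search regions.

```python
import numpy as np

def noise_covariances(sigma_accel, sigma_pixel, dt, cam_dim=13, n_points=4):
    """Illustrative choice: process noise scales with the assumed acceleration
    level (camera block only, since points are static); measurement noise
    scales with the pixel-level matching error."""
    dim = cam_dim + 3 * n_points
    Q = np.zeros((dim, dim))
    Q[:cam_dim, :cam_dim] = (sigma_accel * dt) ** 2 * np.eye(cam_dim)
    R = sigma_pixel ** 2 * np.eye(2)   # per-measurement 2x2 covariance
    return Q, R
```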
Adding and removing features
  • Add
    • Select salient feature in desired region
    • Search along epipolar line
  • Remove
    • If matching repeatedly fails

[Figure: Davison '03]
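A hedged sketch of the epipolar search used when adding a feature, with all names illustrative: the selected salient point x in the reference view maps to a line l = F x in the current view, and candidate matches are accepted only near that line.

```python
import numpy as np

def epipolar_line(F, x):
    """Line l = F x (in the second image) on which the match for the
    image point x = (u, v) from the first image must lie."""
    l = F @ np.array([x[0], x[1], 1.0])
    return l / np.linalg.norm(l[:2])   # normalize so point-line distance is in pixels

def near_line(l, pt, tol=2.0):
    """True if image point pt lies within tol pixels of line l."""
    return abs(l @ np.array([pt[0], pt[1], 1.0])) < tol
```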

Preliminary results
  • Simulation [implemented with Birkbeck]
    • Behaves according to model
    • Initial estimate of camera and 4 key points is true value + small amount of noise
    • Initial estimate of other points is true value + significant noise
    • Initial covariance is scaled identity
Next
  • Real images (video sequence)
    • Feature matching
    • Tracking
    • SIFT features?
  • Real-time issues
    • Postponement [Davison ’01]
  • Loop closing
    • Davison's system automatically corrects if a feature becomes visible again and is correctly measured, but…
    • Prevent drift by incorporating explicit loop closing [Newman, Ho ’05]