
State Estimation and Kalman Filtering

CS B659

Spring 2013

Kris Hauser

Motivation
  • Observing a stream of data
    • Monitoring (of people, computer systems, etc.)
    • Surveillance, tracking
    • Finance & economics
    • Science
  • Questions:
    • Modeling & forecasting
    • Handling partial and noisy observations
Markov Chains

X0 → X1 → X2 → X3 → …

  • Sequence of probabilistic state variables X0, X1, X2, …
    • E.g., a robot’s position, or a target’s position and velocity
  • P(Xt|Xt-1) is known as the transition model
  • Markov property: conditioned on X1, X0 is independent of X2, X3, …

Inference in MC

[Incremental approach]

  • Prediction: what is the probability of a future state?
    • P(Xt) = Σx0,…,xt-1 P(x0,…,xt-1,Xt) = Σx0,…,xt-1 P(x0) ∏i=1…t P(xi|xi-1) = Σxt-1 P(Xt|xt-1) P(xt-1)
  • The distribution “blurs” over time, approaching the stationary distribution as t grows
    • Limited prediction power
    • The rate of blurring is known as the mixing time
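The prediction recursion and its blurring behavior can be checked numerically. The 2-state chain below is a made-up toy example (my choice of numbers, not from the slides):

```python
import numpy as np

# Numerical sketch of the prediction recursion
#   P(Xt) = Σ_{x_{t-1}} P(Xt|x_{t-1}) P(x_{t-1}).
T = np.array([[0.9, 0.1],   # T[i, j] = P(X_t = j | X_{t-1} = i)
              [0.3, 0.7]])

p = np.array([1.0, 0.0])    # start fully certain about X0
for t in range(50):
    p = p @ T               # one prediction step "blurs" the distribution

# p has approached the stationary distribution [0.75, 0.25],
# the left eigenvector of T with eigenvalue 1.
print(p)
```

The second eigenvalue of this T is 0.6, so the distance to the stationary distribution shrinks by a factor of 0.6 per step; the mixing time is governed by how close that eigenvalue is to 1.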
Modeling Partial Observability

Hidden state variables: X0 → X1 → X2 → X3 → …
Observed variables: O1, O2, O3

  • P(Ot|Xt) is called the observation model (or sensor model)
  • This structure is a Hidden Markov Model (HMM)

Filtering

Hidden chain X0 → X1 → X2 with observations O1, O2: the current state X2 is the query variable, the earlier states X0, X1 are unknown, and the observed O1, O2 have known (given) values.

  • Name comes from signal processing
  • Goal: compute the probability distribution over the current state given the observations up to this point
  • P(Xt|o1:t) = Σxt-1 P(xt-1|o1:t-1) P(Xt|xt-1,ot)
  • P(Xt|xt-1,ot) = P(ot|xt-1,Xt) P(Xt|xt-1) / P(ot|xt-1) = α P(ot|Xt) P(Xt|xt-1)
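In a discrete state space this recursion can be written out directly. The two-state transition and observation matrices below are made-up toy numbers, and `filter_step` is a hypothetical helper name:

```python
import numpy as np

# Discrete-state sketch of the filtering recursion
#   P(Xt|o_{1:t}) ∝ P(ot|Xt) · Σ_{x_{t-1}} P(Xt|x_{t-1}) P(x_{t-1}|o_{1:t-1}).
T = np.array([[0.7, 0.3],   # T[i, j] = P(X_t = j | X_{t-1} = i)
              [0.3, 0.7]])
O = np.array([[0.9, 0.1],   # O[i, k] = P(o_t = k | X_t = i)
              [0.2, 0.8]])

def filter_step(b, o):
    """One predict-then-update step on the belief b = P(X_{t-1}|o_{1:t-1})."""
    pred = b @ T              # predict using the transition model alone
    post = pred * O[:, o]     # update: weight by the observation likelihood
    return post / post.sum()  # normalize (the α factor)

b = np.array([0.5, 0.5])    # uniform prior belief
for o in [0, 0, 1]:         # an example observation sequence
    b = filter_step(b, o)
print(b)                    # current belief over the state
```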

Kalman Filtering
  • In a nutshell
    • Efficient probabilistic filtering in continuous state spaces
    • Linear Gaussian transition and observation models
  • Ubiquitous for state tracking with noisy sensors, e.g. radar, GPS, cameras
Hidden Markov Model for Robot Localization

Hidden state variables: X0 → X1 → X2 → X3 → …
Observed variables: z1, z2, z3

  • Use observations + transition dynamics to get a better idea of where the robot is at time t
  • Maintain a belief state bt over time
    • bt(x) = P(Xt=x|z1:t)
  • Predict – observe – predict – observe…

Bayesian Filtering with Belief States

  • Compute bt, given zt and the prior belief bt-1
  • Recursive filtering equation: bt(x) = α P(zt|x) Σx′ P(x|x′) bt-1(x′)
    • Predict P(Xt|z1:t-1) using the dynamics alone
    • Update via the observation zt

In Continuous State Spaces…

  • Compute bt, given zt and the prior belief bt-1
  • Continuous filtering equation: bt(x) = (1/Z) P(zt|x) ∫ P(x|x′) bt-1(x′) dx′
  • How do we evaluate this integral?
  • How do we calculate Z?
  • How do we even represent a belief state?
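One common answer is to discretize: represent the belief by its values on a grid, replace the integral with a sum, and obtain Z by normalization. This histogram-filter sketch is my illustration; the drift (+0.5), noise levels, and observation z = 1.0 are made-up values:

```python
import numpy as np

# Histogram-filter sketch of the continuous filtering equation on a 1D grid.
xs = np.linspace(-5.0, 5.0, 201)   # grid over a 1D state space
dx = xs[1] - xs[0]

def gauss(x, mu, sigma):
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

b = gauss(xs, 0.0, 1.0)            # prior belief b_{t-1} on the grid

# Predict: b'(x) = ∫ P(x|x') b_{t-1}(x') dx', with P(x|x') = N(x' + 0.5, 0.5)
trans = gauss(xs[:, None], xs[None, :] + 0.5, 0.5)
b = (trans * b[None, :]).sum(axis=1) * dx

# Update: multiply by the observation likelihood P(z|x) and renormalize
z = 1.0
b = b * gauss(z, xs, 0.8)          # observation noise std 0.8 (assumed)
b = b / (b.sum() * dx)             # the normalizing sum plays the role of Z
```

For these Gaussian ingredients the exact posterior mean is (1.25·z + 0.64·0.5)/1.89 ≈ 0.831, which the grid answer matches closely; grid resolution and truncation bounds control the approximation error.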
Key Representational Decisions
  • Pick a method for representing distributions
    • Discrete: tables
    • Continuous: fixed parameterized classes vs. particle-based techniques
  • Devise methods to perform key calculations (marginalization, conditioning) on the representation
    • Exact or approximate?
Gaussian Distribution

  • Mean μ, standard deviation σ
  • Distribution is denoted N(μ,σ)
  • If X ~ N(μ,σ), then P(x) = (1/Z) exp(−(x−μ)²/(2σ²))
  • With normalization factor Z = σ√(2π)
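The normalization factor can be sanity-checked numerically (grid bounds and resolution are my choice):

```python
import numpy as np

# Numeric check: integrating exp(-(x-μ)²/(2σ²)) over the real line
# gives Z = σ√(2π).
mu, sigma = 0.0, 1.5
xs = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 100_001)
f = np.exp(-(xs - mu)**2 / (2 * sigma**2))
area = f.sum() * (xs[1] - xs[0])          # Riemann-sum approximation
print(area, sigma * np.sqrt(2 * np.pi))   # both ≈ 3.7599
```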
Linear Gaussian Transition Model for Moving 1D Point

  • Consider position and velocity xt, vt
  • Time step h
  • Without noise: xt+1 = xt + h vt and vt+1 = vt
  • With Gaussian noise of std σ1:

P(xt+1|xt,vt) ∝ exp(−(xt+1 − (xt + h vt))²/(2σ1²))

i.e., Xt+1 ~ N(xt + h vt, σ1)

Linear Gaussian Transition Model

  • If the prior on position is Gaussian, then the predicted distribution is also Gaussian
  • The mean shifts by vh and the noise σ1 inflates the spread (variances add):

N(μ,σ) → N(μ + vh, √(σ² + σ1²))
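A quick Monte-Carlo check of the prediction step (all numbers are toy values of my choosing):

```python
import numpy as np

# Pushing a Gaussian position prior through x' = x + h·v plus noise of std σ1
# gives mean μ + hv and variance σ² + σ1² (variances add, not std deviations).
rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.0        # position prior N(μ, σ)
v, h = 3.0, 0.1             # velocity and time step
sigma1 = 0.5                # dynamics noise std

x = rng.normal(mu, sigma, 1_000_000)
x_next = x + h * v + rng.normal(0.0, sigma1, x.size)

print(x_next.mean())        # ≈ μ + hv = 2.3
print(x_next.std())         # ≈ √(σ² + σ1²) ≈ 1.118
```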

Linear Gaussian Observation Model

  • Position observation zt
  • Gaussian noise of std σ2:

zt ~ N(xt,σ2)

Linear Gaussian Observation Model

  • If the prior on position is Gaussian, then the posterior is also Gaussian
  • Combining the position prior N(μ,σ) with the observation probability for zt = z gives the posterior N(μ′,σ′) with
    • μ′ = (σ²z + σ2²μ)/(σ² + σ2²)
    • σ′² = σ²σ2²/(σ² + σ2²)

Multivariate Gaussians

X ~ N(μ,Σ)

  • Multivariate analog in N-D space
  • Mean (vector) μ, covariance (matrix) Σ
  • P(x) = (1/Z) exp(−½ (x−μ)ᵀ Σ⁻¹ (x−μ))
  • With normalization factor Z = (2π)^(N/2) |Σ|^(1/2)
Multivariate Linear Gaussian Process

  • A linear transformation + multivariate Gaussian noise: y = A x + ε, with ε ~ N(μ,Σ)
  • If the prior state distribution is Gaussian, then the posterior state distribution is Gaussian
  • If we observe some components of a Gaussian, then the posterior over the remaining components is also Gaussian

Multivariate Computations

  • Linear transformations of Gaussians
    • If x ~ N(μ,Σ) and y = A x + b
    • Then y ~ N(Aμ + b, AΣAᵀ)
  • Consequence: sums of independent Gaussians
    • If x ~ N(μx,Σx), y ~ N(μy,Σy), and z = x + y
    • Then z ~ N(μx + μy, Σx + Σy)
  • Conditionals of a Gaussian
    • If [x1,x2] ~ N([μ1; μ2], [Σ11, Σ12; Σ21, Σ22])
    • Then upon observing x2 = z, we have x1 ~ N(μ1 + Σ12Σ22⁻¹(z − μ2), Σ11 − Σ12Σ22⁻¹Σ21)
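A numeric sketch of the conditioning rule on a 2D joint Gaussian (toy numbers, my choice):

```python
import numpy as np

# Observing x2 = z yields x1 ~ N(μ1 + Σ12 Σ22⁻¹ (z − μ2), Σ11 − Σ12 Σ22⁻¹ Σ21).
mu = np.array([1.0, 2.0])
S = np.array([[2.0, 0.8],    # [[Σ11, Σ12],
              [0.8, 1.0]])   #  [Σ21, Σ22]]

z = 3.0                      # observed value of x2
S11, S12, S22 = S[0, 0], S[0, 1], S[1, 1]

mu_cond = mu[0] + S12 / S22 * (z - mu[1])   # conditional mean of x1
var_cond = S11 - S12 / S22 * S12            # conditional variance of x1
print(mu_cond, var_cond)                    # 1.8, 1.36
```

Note the conditional variance never exceeds Σ11: observing a correlated variable can only reduce uncertainty.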
Next time
  • Principles Ch. 9
  • Rekleitis (2004)