State space models

Let {xt : t ∈ T} and {yt : t ∈ T} denote two vector-valued time series that satisfy the system of equations:

yt = At xt + vt (The observation equation)

xt = Bt xt-1 + ut (The state equation)

The time series {yt : t ∈ T} is said to have a state-space representation.


Note: {ut : t ∈ T} and {vt : t ∈ T} denote two vector-valued time series satisfying:

  • E(ut) = E(vt) = 0.

  • E(ut us′) = E(vt vs′) = 0 if t ≠ s.

  • E(ut ut′) = Σu and E(vt vt′) = Σv.

  • E(ut vs′) = E(vt us′) = 0 for all t and s.


Example: One might be tracking an object with several radar stations. The process {xt : t ∈ T} gives the position of the object at time t. The process {yt : t ∈ T} denotes the observations at time t made by the several radar stations.

As in the Hidden Markov Model, we will be interested in determining the position of the object, {xt : t ∈ T}, from the observations, {yt : t ∈ T}, made by the several radar stations.


Example: Many of the models we have considered to date can be thought of as state-space models.

Autoregressive model of order p:


Define

Then

Observation equation

and

State equation
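A sketch of the usual companion-matrix construction (the coefficient names β1, …, βp and the noise term ut are assumptions, not necessarily the slides' notation): for the AR(p) model

y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \cdots + \beta_p y_{t-p} + u_t,

define the state vector x_t = (y_t, y_{t-1}, \ldots, y_{t-p+1})'. Then

y_t = (1, 0, \ldots, 0)\, x_t \quad \text{(observation equation, with } v_t \equiv 0\text{)}

x_t = \begin{pmatrix} \beta_1 & \beta_2 & \cdots & \beta_{p-1} & \beta_p \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix} x_{t-1} + \begin{pmatrix} u_t \\ 0 \\ \vdots \\ 0 \end{pmatrix} \quad \text{(state equation)}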


Hidden Markov Model: Assume that there are m states and that the observations Yt are discrete, taking on n possible values.

Suppose that the m states are denoted by the vectors:


Suppose that the n possible observations taken at each state are


Let

and

Note


Let

So that

The State Equation

with


Also

Hence

and

where diag(v) = the diagonal matrix with the components of the vector v along the diagonal


Since

then

and

Thus


We have defined

Hence

Let


Then

The Observation Equation

with

and


Hence with these definitions the state sequence of a Hidden Markov Model satisfies:

The State Equation

with

and

The observation sequence satisfies:

The Observation Equation

with

and
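A sketch of how these equations can be written out, under assumed notation (P for the m × m transition matrix and Q for the m × n matrix of observation probabilities; neither symbol is necessarily the slides' own): take xt to be the indicator vector of the state occupied at time t and yt the indicator vector of the observation at time t. Then

E(x_t \mid x_{t-1}) = P' x_{t-1}, \qquad \text{so} \qquad x_t = P' x_{t-1} + u_t \ \text{with}\ E(u_t \mid x_{t-1}) = 0,

E(y_t \mid x_t) = Q' x_t, \qquad \text{so} \qquad y_t = Q' x_t + v_t \ \text{with}\ E(v_t \mid x_t) = 0,

so that Bt = P′ and At = Q′ in the general notation above. The disturbances have zero conditional mean but state-dependent conditional covariances; the covariance of an indicator vector involves diag(·) terms, which is presumably the role of diag(v) above.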


Kalman Filtering


We are now interested in determining the state vector xt in terms of some or all of the observation vectors y1, y2, y3, … , yT.

We will consider finding the “best” linear predictor.

We can include a constant term if, in addition, one of the observations (y0, say) is the vector of 1's.

We will consider estimation of xt in terms of

  • y1, y2, y3, … , yt-1 (the prediction problem)

  • y1, y2, y3, … , yt (the filtering problem)

  • y1, y2, y3, … , yT (t < T, the smoothing problem)
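As a sketch of a convenient shorthand for what follows (the symbol \hat{x}_{t \mid s} is an assumption, not necessarily the slides' own notation): write

\hat{x}_{t \mid s} = \text{the best linear predictor of } x_t \text{ based on } y_0, y_1, \ldots, y_s,

so the three problems above amount to computing \hat{x}_{t \mid t-1} (prediction), \hat{x}_{t \mid t} (filtering) and \hat{x}_{t \mid T} with t < T (smoothing).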


For any vector x define:

where

is the best linear predictor of x(i), the ith component of x, based on y0, y1, y2, … , ys.

The best linear predictor of x(i) is the linear function of y0, y1, y2, … , ys that minimizes


Remark: The best predictor is the unique vector of the form:

where C0, C1, C2, … , Cs are selected so that:


Remark: If x, y1, y2, … , ys are normally distributed then:


Remark:

Let u and v be two random vectors; then

is the optimal linear predictor of u based on v if




Kalman Filtering:

Let {xt : t ∈ T} and {yt : t ∈ T} denote two vector-valued time series that satisfy the system of equations:

yt = At xt + vt

xt = Bt xt-1 + ut

Let

and


Then

where

One also assumes that the initial vector x0 has mean μ and covariance matrix Σ, and that x0 is uncorrelated with ut and vt for all t.


The covariance matrices are updated

with


Summary: The Kalman equations

1.

2.

3.

4.

5.

with

and
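A sketch of the standard recursions this summary lists, written in the assumed \hat{x}_{t \mid s} notation, with P_{t \mid s} for the corresponding error covariance and K_t for the Kalman gain (these symbols are assumptions):

\hat{x}_{t \mid t-1} = B_t\, \hat{x}_{t-1 \mid t-1}

P_{t \mid t-1} = B_t\, P_{t-1 \mid t-1}\, B_t' + \Sigma_u

K_t = P_{t \mid t-1}\, A_t'\,\bigl(A_t P_{t \mid t-1} A_t' + \Sigma_v\bigr)^{-1}

\hat{x}_{t \mid t} = \hat{x}_{t \mid t-1} + K_t\,\bigl(y_t - A_t \hat{x}_{t \mid t-1}\bigr)

P_{t \mid t} = (I - K_t A_t)\, P_{t \mid t-1}

with starting values taken as \hat{x}_{0 \mid 0} = \mu and P_{0 \mid 0} = \Sigma.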


Proof:

Now

hence

proving (4)

Note


Let

Let

Given y0, y1, y2, … , yt-1 the best linear predictor of dt using et is:


Hence

(5)

where

and

Now


Also

hence

(2)


Thus

(4)

(5)

where

(2)

Also


Hence

(3)

The proof that

(1)

will be left as an exercise.


Example:

Suppose we have an AR(2) time series

What we observe is the time series

{ut : t ∈ T} and {vt : t ∈ T} are white noise time series with standard deviations σu and σv.
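A sketch of this setup in state-space form (the coefficient names β1, β2 and the observation relation yt = xt + vt are assumptions inferred from the description):

x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + u_t, \qquad y_t = x_t + v_t.

Taking the state vector to be (x_t, x_{t-1})',

\begin{pmatrix} x_t \\ x_{t-1} \end{pmatrix} = \begin{pmatrix} \beta_1 & \beta_2 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_{t-1} \\ x_{t-2} \end{pmatrix} + \begin{pmatrix} u_t \\ 0 \end{pmatrix}, \qquad y_t = (1, 0) \begin{pmatrix} x_t \\ x_{t-1} \end{pmatrix} + v_t,

with \Sigma_u = \begin{pmatrix} \sigma_u^2 & 0 \\ 0 & 0 \end{pmatrix} and \Sigma_v = \sigma_v^2.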



The equation defining:

can be written

Note:


The Kalman equations

1.

2.

3.

4.

5.

Let


The Kalman equations

1.


2.


3.


4.


5.
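A minimal numerical sketch of these recursions for the AR(2)-plus-noise example, using Python with numpy; the parameter values (β1 = 0.8, β2 = -0.2, σu = 1, σv = 0.5) and the initial mean and covariance are made up for illustration and are not from the slides.

import numpy as np

# Hypothetical parameter values (illustration only, not from the slides)
beta1, beta2 = 0.8, -0.2
sigma_u, sigma_v = 1.0, 0.5

B = np.array([[beta1, beta2],
              [1.0,   0.0]])            # state transition matrix B
A = np.array([[1.0, 0.0]])              # observation matrix A
Su = np.array([[sigma_u**2, 0.0],
               [0.0,        0.0]])      # Var(u_t): noise enters the first component only
Sv = np.array([[sigma_v**2]])           # Var(v_t)

rng = np.random.default_rng(0)
T = 200

# Simulate the AR(2) state and the noisy observations
x = np.zeros((T, 2))
y = np.zeros(T)
y[0] = x[0, 0] + rng.normal(0.0, sigma_v)
for t in range(1, T):
    x[t] = B @ x[t - 1] + np.array([rng.normal(0.0, sigma_u), 0.0])
    y[t] = x[t, 0] + rng.normal(0.0, sigma_v)

# Forward (filtering) recursions
x_hat = np.zeros(2)                     # assumed initial state mean
P = np.eye(2)                           # assumed initial state covariance
filtered = np.zeros(T)
for t in range(T):
    # prediction step
    x_pred = B @ x_hat
    P_pred = B @ P @ B.T + Su
    # update step: innovation, Kalman gain, filtered mean and covariance
    innovation = y[t] - (A @ x_pred)[0]
    S = A @ P_pred @ A.T + Sv
    K = P_pred @ A.T @ np.linalg.inv(S)
    x_hat = x_pred + K[:, 0] * innovation
    P = (np.eye(2) - K @ A) @ P_pred
    filtered[t] = x_hat[0]

print("RMSE of filtered state:", np.sqrt(np.mean((filtered - x[:, 0]) ** 2)))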


Kalman Filtering (smoothing):

Now consider finding

These can be found by successive backward recursions for t = T, T – 1, … , 2, 1

where



The backward recursions

1.

2.

3.
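A sketch of the standard fixed-interval (Rauch-Tung-Striebel) backward recursions in the assumed \hat{x}_{t \mid s}, P_{t \mid s} notation, run backward in t starting from the filtered values \hat{x}_{T \mid T} and P_{T \mid T}:

J_t = P_{t \mid t}\, B_{t+1}'\, P_{t+1 \mid t}^{-1}

\hat{x}_{t \mid T} = \hat{x}_{t \mid t} + J_t\,\bigl(\hat{x}_{t+1 \mid T} - \hat{x}_{t+1 \mid t}\bigr)

P_{t \mid T} = P_{t \mid t} + J_t\,\bigl(P_{t+1 \mid T} - P_{t+1 \mid t}\bigr)\, J_t'

Here \hat{x}_{t \mid t}, \hat{x}_{t+1 \mid t}, P_{t \mid t} and P_{t+1 \mid t} are the quantities already calculated in the forward (filtering) recursion.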

In the example:

- calculated in forward recursion

