Systems Identification

SYSTEMS Identification

Ali Karimpour

Assistant Professor

Ferdowsi University of Mashhad


Lecture 9

Asymptotic Distribution of Parameter Estimators

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


Overview

If convergence is guaranteed, the estimate tends to a limiting parameter vector as the number of data points grows.

But how fast does the estimate approach this limit?

What is the probability distribution of the estimation error?

  • The variance analysis of this chapter will reveal (a compact statement is sketched after this list):

  • a) The estimate converges to the limit at a rate proportional to 1/√N.

  • b) The (suitably scaled) distribution converges to a Gaussian distribution N(0, Q).

  • c) The covariance matrix Q depends on:

    • - the number of samples / data set size N,

    • - the parameter sensitivity of the predictor, and

    • - the noise variance.
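As a compact reference, here is a minimal LaTeX sketch of the asymptotic normality result this chapter works toward; the symbols θ̂_N (the estimate from N data points), θ* (its limit), and P_θ (the asymptotic covariance matrix) are assumed notation in the style of Ljung, not taken verbatim from the slides:

```latex
% assumed notation: \hat{\theta}_N estimate, \theta^{*} its limit, P_\theta asymptotic covariance
\sqrt{N}\,\bigl(\hat{\theta}_N - \theta^{*}\bigr)
  \;\xrightarrow{\;d\;}\; \mathcal{N}(0,\,P_\theta),
\qquad
\operatorname{Cov}\hat{\theta}_N \;\approx\; \tfrac{1}{N}\,P_\theta .
```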


Central Limit Theorem

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


Central Limit Theorem

The mathematical tool needed for asymptotic variance analysis is the central limit theorem.

Example: Consider two independent random variables, X and Y, with the same uniform distribution, shown in the figure below.

Define another random variable Z as the sum of X and Y: Z = X + Y. The distribution of Z is obtained by convolving the two uniform densities:


Central Limit Theorem

Further, consider W = X + Y + Z. The resultant PDF gets even closer to a Gaussian distribution.

In general, the PDF of the sum of N independent random variables approaches a Gaussian distribution, regardless of the PDF of each term, as N gets larger.
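A minimal numerical sketch of this effect (illustrative only, not part of the slides): sums of independent U(0,1) variables are standardized and their higher moments are compared with those of a Gaussian, which has zero skewness and zero excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(0)

def sum_of_uniforms(n_terms, n_samples=100_000):
    """Draw n_samples realizations of the sum of n_terms independent U(0,1) variables."""
    return rng.uniform(0.0, 1.0, size=(n_samples, n_terms)).sum(axis=1)

# Illustrative demo; the numbers of terms and samples are arbitrary choices.
for n in (1, 2, 3, 10):
    s = sum_of_uniforms(n)
    z = (s - s.mean()) / s.std()          # standardize the sum
    skew = np.mean(z**3)                  # tends to 0 as n grows
    ex_kurt = np.mean(z**4) - 3.0         # tends to 0 as n grows
    print(f"n={n:2d}  skewness={skew:+.3f}  excess kurtosis={ex_kurt:+.3f}")
```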


Central Limit Theorem

Let each term be a d-dimensional random variable with a given

mean and

covariance.

Consider the normalized sum of the first N terms. Then, as N tends to infinity, the distribution of this sum converges to a Gaussian distribution; a standard form of the statement is sketched below.
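A minimal LaTeX sketch of a standard form of this statement, written for independent, identically distributed terms for concreteness; the symbols x_t, m, and R are assumptions, not recovered from the slide:

```latex
% assumed: x_t \in \mathbb{R}^d i.i.d. with mean m and covariance R
Y_N \;=\; \frac{1}{\sqrt{N}} \sum_{t=1}^{N} \bigl(x_t - m\bigr)
\;\xrightarrow{\;d\;}\; \mathcal{N}(0,\,R),
\qquad
f(y) \;=\; \frac{1}{(2\pi)^{d/2}\,(\det R)^{1/2}}
           \exp\!\Bigl(-\tfrac{1}{2}\,y^{\mathsf T} R^{-1} y\Bigr).
```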


The Prediction-Error approach: Basic Theorem

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


The Prediction-Error Approach

Applying the central limit theorem, we can obtain the distribution of the estimate as N tends to infinity.

Let the estimate be obtained by the prediction-error method. Then, with a prime denoting differentiation with respect to the parameter vector, expanding the criterion gradient around the limit estimate gives a first-order relation whose intermediate point is a vector "between" the estimate and the limit (a sketch follows).
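A minimal LaTeX sketch of the expansion being described, with assumed notation V_N for the criterion, θ̂_N for its minimizer, θ* for the limit, and ξ_N for the intermediate point:

```latex
% assumed notation in the style of Ljung's Chapter 9
0 \;=\; V_N'(\hat{\theta}_N)
  \;=\; V_N'(\theta^{*}) + V_N''(\xi_N)\,\bigl(\hat{\theta}_N - \theta^{*}\bigr)
\quad\Longrightarrow\quad
\hat{\theta}_N - \theta^{*}
  \;=\; -\bigl[V_N''(\xi_N)\bigr]^{-1} V_N'(\theta^{*}),
```

where ξ_N lies "between" θ̂_N and θ*.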


The Prediction-Error Approach

Assume that the second-derivative matrix in this expansion is nonsingular; then the estimation error can be solved for explicitly.

To obtain the distribution of the estimate, the scaled criterion gradient and the second-derivative matrix must be computed as N tends to infinity.

Here, as usual, the criterion is the quadratic prediction-error criterion, sketched below.
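A minimal LaTeX sketch of the usual quadratic prediction-error criterion assumed here (notation in the style of Ljung, not recovered verbatim from the slide):

```latex
% assumed quadratic criterion and prediction error
V_N(\theta) \;=\; \frac{1}{N}\sum_{t=1}^{N} \tfrac{1}{2}\,\varepsilon^{2}(t,\theta),
\qquad
\varepsilon(t,\theta) \;=\; y(t) - \hat{y}(t\,|\,\theta).
```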


The Prediction-Error Approach

For simplicity, we first assume that the predictor is given by a linear regression.

The actual data are generated by the same regression structure, with the parameter vector of the true system plus a disturbance term. Solving the least-squares problem then gives an explicit expression for the estimation error (a sketch follows):
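A minimal LaTeX sketch of the linear-regression case, with assumed notation φ(t) for the regressor, θ₀ for the true parameter vector, and e(t) for the disturbance:

```latex
% assumed notation: \varphi regressor, \theta_0 true parameters, e disturbance
\hat{y}(t\,|\,\theta) = \varphi^{\mathsf T}(t)\,\theta,
\qquad
y(t) = \varphi^{\mathsf T}(t)\,\theta_0 + e(t)
\;\Longrightarrow\;
\hat{\theta}_N - \theta_0 =
\Bigl[\tfrac{1}{N}\sum_{t=1}^{N}\varphi(t)\varphi^{\mathsf T}(t)\Bigr]^{-1}
\tfrac{1}{N}\sum_{t=1}^{N}\varphi(t)\,e(t).
```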


The Prediction-Error Approach

Let us treat the sum of regressor-noise products as a random variable. Its mean is zero, since the noise has zero mean and is independent of the regressors.

Its covariance is determined by the noise variance and the regressor covariance.

Applying the central limit theorem (a sketch follows):
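A minimal LaTeX sketch of the conclusion for this case, assuming e(t) is zero-mean white noise with variance λ₀ independent of the regressors, and writing R for the regressor covariance matrix:

```latex
% assumed: R = \bar{E}\,\varphi(t)\varphi^{\mathsf T}(t), \lambda_0 noise variance
\frac{1}{\sqrt{N}}\sum_{t=1}^{N}\varphi(t)\,e(t)
\;\xrightarrow{\;d\;}\;\mathcal{N}(0,\,\lambda_0 R)
\quad\Longrightarrow\quad
\sqrt{N}\,\bigl(\hat{\theta}_N - \theta_0\bigr)
\;\xrightarrow{\;d\;}\;\mathcal{N}\bigl(0,\,\lambda_0 R^{-1}\bigr).
```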


The Prediction-Error Approach

Next, compute the limit of the second-derivative matrix:

And:


The Prediction-Error Approach

We obtain:

So:


The Prediction-Error Approach

The general result on the distribution of the estimate is summarized in the following theorem, i.e. Theorem 9.1 in Ljung's textbook.

Theorem 1: Consider the estimate determined by:

Assume that the model structure is linear and uniformly stable and that the data set satisfies the quasi-stationarity requirements. Assume also that the estimate converges with probability 1 to a unique parameter vector:

Also, assume that:

and that:

converge to the corresponding limits with probability 1,


The Prediction-Error Approach

where the limit is the ensemble mean given by:

Then, the distribution of the scaled estimation error converges to the Gaussian distribution given by:


The Prediction-Error Approach

As stated formally in Theorem 1, the distribution of the scaled estimation error converges to a Gaussian distribution for a broad class of system identification problems.

This implies that the covariance of the estimate asymptotically converges to an expression of the following kind (sketched below).

  • This is called the asymptotic covariance matrix, and it depends not only on

    • (a) the number of samples / data set size N, but also on

    • (b) the parameter sensitivity of the predictor, and

    • (c) the noise variance.
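A minimal LaTeX sketch of the standard expression for the quadratic-criterion case, assuming the prediction errors at θ* are white noise with variance λ₀ and writing ψ(t, θ*) for the predictor gradient (notation assumed, in the style of Ljung):

```latex
% assumed: \psi(t,\theta^{*}) = \partial \hat{y}(t|\theta)/\partial\theta at \theta^{*}
\operatorname{Cov}\hat{\theta}_N \;\approx\; \frac{1}{N}\,P_\theta,
\qquad
P_\theta \;=\; \lambda_0
  \Bigl[\bar{E}\,\psi(t,\theta^{*})\,\psi^{\mathsf T}(t,\theta^{*})\Bigr]^{-1}.
```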


Expression for the Asymptotic Variance

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


Quadratic Criterion

Let us compute the covariance once again for the general case.

Unlike the linear regression case, the sensitivity (the predictor gradient) is now a function of θ.

Assume that the prediction error at the limit parameter is white noise with zero mean and a given variance. We have:


Quadratic Criterion

Similarly

Hence:

The asymptotic variance is therefore

a) inversely proportional to the number of samples, b) proportional to the noise variance, and c) inversely related to the parameter sensitivity.

The more a parameter affects the prediction, the smaller the variance becomes.


Quadratic Criterion

Since the true predictor sensitivity and noise variance are not known, the asymptotic variance cannot be computed exactly a priori.

A very important and useful aspect of the expressions for the asymptotic covariance matrix is that it can be estimated from data. Having N data points and the estimate determined, we may use sample averages of the prediction errors and predictor gradients as plug-in estimates (a sketch follows). In this way, an indication of how many data samples are needed for a given model accuracy may be obtained.
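A minimal numerical sketch of such a plug-in estimate, assuming a linear-regression predictor so that the gradient equals the regressor; the simulated first-order ARX model and all numerical values are illustrative assumptions, not the slide's data:

```python
import numpy as np

def estimated_covariance(Phi, eps):
    """Plug-in estimate of Cov(theta_hat): (lambda_hat / N) * [(1/N) sum phi phi^T]^{-1},
    with lambda_hat = (1/N) sum eps^2.  Phi: (N, d) regressors, eps: (N,) prediction errors."""
    N = Phi.shape[0]
    lam_hat = float(np.mean(eps ** 2))        # estimated noise variance
    R_hat = (Phi.T @ Phi) / N                 # sample covariance of the regressors
    return (lam_hat / N) * np.linalg.inv(R_hat)

# Illustrative use on simulated data (assumed model): y(t) = -a0*y(t-1) + b0*u(t-1) + e(t)
rng = np.random.default_rng(1)
a0, b0, N = 0.5, 1.0, 2000
u = rng.standard_normal(N)
e = rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a0 * y[t - 1] + b0 * u[t - 1] + e[t]

Phi = np.column_stack([-y[:-1], u[:-1]])      # regressors phi(t) = [-y(t-1), u(t-1)]
theta_hat, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
eps = y[1:] - Phi @ theta_hat
print("theta_hat =", theta_hat)
print("estimated Cov(theta_hat) =\n", estimated_covariance(Phi, eps))
```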


Example: Covariance of LS Estimates

Consider the system below, where the input and the disturbance are two independent white noise sequences with given variances.

Suppose that the coefficient of the input term is known and the system is identified in the model structure below, or equivalently in linear-regression form. We have:


Example: Covariance of LS Estimates

Hence:

To compute the covariance, square the first equation and take the expectation.

Multiplying the first equation by the delayed signal and taking the expectation gives:

The last equality follows since, due to the time delay, the delayed signal does not affect the other factor.

Hence:


Example: Covariance of LS Estimates

Assume a = 0.1,

The estimated values of the parameter a, over 100 independent experiments using the LS estimate, are shown in the figure below.

Cov(a) = 0.0471
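A minimal Monte Carlo sketch of such an experiment; the model form y(t) + a·y(t-1) = u(t-1) + e(t) with the input coefficient known, the unit variances, and the sample length are assumptions for illustration, so the resulting covariance will not reproduce the slide's numbers:

```python
import numpy as np

rng = np.random.default_rng(2)

def ls_estimate_a(a_true, var_u, var_e, N=500):
    """One experiment: simulate the assumed model y(t) + a*y(t-1) = u(t-1) + e(t),
    then estimate a by least squares with the input coefficient treated as known."""
    u = rng.standard_normal(N) * np.sqrt(var_u)
    e = rng.standard_normal(N) * np.sqrt(var_e)
    y = np.zeros(N)
    for t in range(1, N):
        y[t] = -a_true * y[t - 1] + u[t - 1] + e[t]
    phi = -y[:-1]                    # regressor -y(t-1)
    z = y[1:] - u[:-1]               # y(t) with the known input term removed
    return (phi @ z) / (phi @ phi)   # scalar least-squares estimate of a

# 100 independent experiments, as in the slide's setup (parameter values assumed).
estimates = np.array([ls_estimate_a(0.1, 1.0, 1.0) for _ in range(100)])
print("mean of a_hat:", estimates.mean())
print("variance of a_hat:", estimates.var())
```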


Example: Covariance of LS Estimates

Now, assume a = 0.1,

Cov(a) = 0.0014


Example: Covariance of LS Estimates

Now, assume a = 0.1,

Cov(a) = 0.1204


Example: Covariance of an MA(1) Parameter

Consider the MA(1) system below, where the driving noise is white with a given variance. The MA(1) model structure is used.

Given the predictor (4.18), differentiation with respect to c gives the predictor gradient.

Evaluating it at the true parameter, we have:

If the PEM estimate of c is considered, its asymptotic variance follows (a commonly quoted closed form is sketched below):
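A minimal LaTeX sketch of the commonly quoted closed-form result for this example, assuming the MA(1) system y(t) = e(t) + c₀·e(t-1) with |c₀| < 1 and PEM estimation of c; this is a standard result, not recovered verbatim from the slide:

```latex
% assumed system and notation
y(t) = e(t) + c_0\,e(t-1), \quad |c_0| < 1
\quad\Longrightarrow\quad
\operatorname{Var}\hat{c}_N \;\approx\; \frac{1 - c_0^{2}}{N}.
```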


Asymptotic Variance for General Norms

For a general norm in the criterion, we have:

Similarly:

We can use the asymptotic normality result in this more general form whenever required. The expression for the asymptotic covariance matrix is rather complicated in general.


Asymptotic Variance for General Norms

Assume that the norm is fixed. Then, under the stated assumptions and after straightforward calculations:

Clearly, for a quadratic norm this reduces to the earlier expression.

The choice of norm in the criterion only acts as a scaling of the covariance matrix.


Frequency-Domain Expressions for the Asymptotic Variance

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


Frequency-Domain Expressions for the Asymptotic Variance

The asymptotic variance has a different expression in the frequency domain, which we will find useful for variance analysis and experiment design.

Let the transfer function and the noise model be consolidated into a matrix T.

The gradient of T, that is, the sensitivity of T to θ, is:

For the predictor, we have already defined W(q, θ) and z(t), such that:


Frequency-Domain Expressions for the Asymptotic Variance

Therefore, the predictor sensitivity is given by:

where

Substituting into the first equation:


Frequency-Domain Expressions for the Asymptotic Variance

At the true system parameters, note that:

where

Let the spectrum matrix of the corresponding signal be:

Using the familiar formula:


Frequency-Domain Expressions for the Asymptotic Variance

For the noise spectrum, we have:

Using this in the equation below, we obtain:

the asymptotic variance in the frequency domain (a generic form is sketched below).
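A minimal LaTeX sketch of the generic frequency-domain form, assuming Φ_ψ(ω) denotes the spectrum of the predictor gradient ψ(t, θ*); this is the Parseval-type rewriting of the time-domain expression, not necessarily the slide's exact formula:

```latex
% assumed: \Phi_\psi(\omega) spectrum of the predictor gradient at \theta^{*}
\bar{E}\,\psi(t,\theta^{*})\,\psi^{\mathsf T}(t,\theta^{*})
  \;=\; \frac{1}{2\pi}\int_{-\pi}^{\pi}\Phi_{\psi}(\omega)\,d\omega
\quad\Longrightarrow\quad
P_\theta \;=\; \lambda_0
  \Bigl[\frac{1}{2\pi}\int_{-\pi}^{\pi}\Phi_{\psi}(\omega)\,d\omega\Bigr]^{-1}.
```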


Distribution of the Estimate for the Correlation Approach

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


The Correlation Approach

We shall confine ourselves to the case studied in Theorem 8.6, that is, the correlation approach with linearly generated instruments. We thus have:

By a Taylor expansion we have:

This is entirely analogous to the result obtained previously for the prediction-error approach, with the difference that the criterion gradient is replaced by the correlation (instrument) vector.


The Correlation Approach

Theorem: Consider the estimate determined by:

Assume that the estimate is computed for a linear, uniformly stable model structure, and that:

is a uniformly stable family of filters.

Assume also that the indicated limit matrix is nonsingular and that:

Then:


The Correlation Approach


Example: Covariance of an MA(1) Parameter

Consider the same MA(1) system, where the driving noise is white with a given variance. Using the PLR (pseudo-linear regression) method, at the true parameter we have:


Distribution of the Estimate for Instrumental Variable Methods

Topics to be covered include:

  • Central Limit Theorem

  • The Prediction-Error Approach: Basic Theorem

  • Expression for the Asymptotic Variance

  • Frequency-Domain Expressions for the Asymptotic Variance

  • Distribution of the Estimate for the Correlation Approach

  • Distribution of the Estimate for Instrumental Variable Methods


Instrumental Variable Methods

We have:

Suppose the true system is given as:

where e(t) is white noise with a given variance, independent of {u(t)}. Then the disturbance term is independent of {u(t)}, and hence of the instruments, if the system operates in open loop. Thus the true parameter vector is a solution to:


Instrumental Variable Methods

To obtain an asymptotic distribution, we shall assume this is the only solution.

Introduce also the monic filter:

Inserting it into these equations gives:


Example: Covariance of LS Estimates

Consider the same system, where the input and the disturbance are two independent white noise sequences with given variances.

Suppose that the coefficient of the input term is known and the system is identified in the same model structure.

Let a be estimated by the IV method, using instruments constructed from past inputs.

By comparing the above system with the general expressions, we have (a numerical sketch follows):
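A minimal numerical sketch of an IV estimate for this kind of model; the instrument choice ζ(t) = u(t-2), the unit variances, and the sample length are illustrative assumptions, not the slide's exact settings:

```python
import numpy as np

rng = np.random.default_rng(3)

def iv_estimate_a(a_true, var_u, var_e, N=5000):
    """Simulate the assumed model y(t) + a*y(t-1) = u(t-1) + e(t) and estimate a
    by the IV method, using the delayed input zeta(t) = u(t-2) as instrument."""
    u = rng.standard_normal(N) * np.sqrt(var_u)
    e = rng.standard_normal(N) * np.sqrt(var_e)
    y = np.zeros(N)
    for t in range(1, N):
        y[t] = -a_true * y[t - 1] + u[t - 1] + e[t]
    # Regression form: y(t) - u(t-1) = a * (-y(t-1)) + e(t), for t = 2..N-1
    phi  = -y[1:-1]          # regressor  -y(t-1)
    z    = y[2:] - u[1:-1]   # y(t) with the known input term removed
    zeta = u[:-2]            # instrument u(t-2), uncorrelated with e(t)
    return (zeta @ z) / (zeta @ phi)   # scalar IV estimate of a

print("IV estimate of a:", iv_estimate_a(0.1, 1.0, 1.0))
```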


Instrumental Variable Methods

