Modeling methods

Used in system identification and in MRAS (Model Reference Adaptive Systems)


Linear-in-parameter models (models for describing linear systems)

  • Auto-Regressive Model (AR model)

  • Moving Average Model (MA model)

  • Finite Impulse Response Model (FIR model)

  • Auto-Regressive model with extraneous (extra) input (ARX model)

  • Auto-Regressive Moving Average model (ARMA model)

  • Auto-Regressive Moving Average model with extraneous input (ARMAX model)

  • Auto-Regressive Integrated Moving Average model (ARIMA model)

  • Auto-Regressive Integrated Moving Average model with extraneous input (ARIMAX model)


Contd.

  • Each of the above models describes the relationship between the input, output and error in its own way.

  • Some models give more freedom in describing the input, some in describing the error, and others in describing the output; certain models describe input, output and error with equal freedom.

  • Based on the plant conditions, a particular model can be chosen.

  • The choice of a suitable model for a plant is very important, since the parameters to be estimated depend on the model chosen.

  • Great care therefore needs to be taken in the choice of a model for describing the plant.



Auto-Regressive Model (AR model)

  • The auto-regressive model is given by the equation

    A(q^-1) y(t) = e(t),  i.e.  y(t) + a1 y(t-1) + … + an y(t-n) = e(t)

    where y(t) is the output, e(t) is white noise and q^-1 is the backward shift operator.


  • This model describes a relation only between the output and error.

  • The freedom in describing the output is more than the error.

  • This model is seldom used on its own to describe a plant, since the input is not described here; it is usually used in combination with other models.

  • The block diagram representation of AR model is given by:

The parameter vector to be estimated in this model is θ = [a1, a2, …, an]^T.
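The AR difference equation above can be turned directly into a simulation. The following is an illustrative sketch, not code from the presentation; the function name and coefficient values are my own choices.

```python
import random

def simulate_ar(a, n, seed=0):
    """Simulate an AR process A(q^-1) y(t) = e(t), i.e.
    y(t) = -a1*y(t-1) - ... - an*y(t-n) + e(t), driven by white noise e."""
    rng = random.Random(seed)
    y = []
    for t in range(n):
        past = sum(-ak * y[t - 1 - k]
                   for k, ak in enumerate(a) if t - 1 - k >= 0)
        y.append(past + rng.gauss(0.0, 1.0))
    return y

# A(q^-1) = 1 - 1.5 q^-1 + 0.7 q^-2: both poles inside the unit circle,
# so the simulated output stays bounded in the mean-square sense.
y = simulate_ar([-1.5, 0.7], 200)
```

Note how the model has no input term u(t): the output is driven purely by the noise, matching the remark that the AR model relates only output and error.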



Moving Average Model (MA model)

  • The moving average model is given by the equation

    y(t) = C(q^-1) e(t) = e(t) + c1 e(t-1) + … + cn e(t-n)



  • This model describes a relation only between the output and error.

  • This model is called Moving Average model because the error here is expressed as a moving average of the white noise.



  • More freedom is given to the description of the error than to the output.

  • This method is seldom used to describe a plant as the input is not described here, but is usually used in combination with other models.

  • The block diagram representation of MA model is given by:

The parameter vector to be estimated in this model is θ = [c1, c2, …, cn]^T.



Finite Impulse Response Model (FIR model)

  • The finite impulse response model is given by the equation

    y(t) = B(q^-1) u(t) + e(t) = b1 u(t-1) + … + bm u(t-m) + e(t)



  • This model describes a relation between the input, error and output.

  • The input can be described with much freedom compared to the error and output.

  • This model can be used to describe plants where much freedom is not required for the description of errors.

  • The block diagram representation of FIR model is:

The parameter vector to be estimated in this model is θ = [b1, b2, …, bm]^T.
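As a small sketch of the noise-free part of this model (function name and coefficients are illustrative, not from the slides), feeding a unit impulse through the B polynomial reproduces its finitely many coefficients, which is where the name "finite impulse response" comes from.

```python
def fir_output(b, u):
    """Noise-free part of the FIR model: y(t) = b1*u(t-1) + ... + bm*u(t-m)."""
    return [sum(bk * u[t - 1 - k]
                for k, bk in enumerate(b) if t - 1 - k >= 0)
            for t in range(len(u))]

# A unit impulse at t = 0 reproduces the impulse response b (delayed by one
# step) and then the output returns to zero: the response is finite.
print(fir_output([0.5, 0.25], [1, 0, 0, 0]))  # [0, 0.5, 0.25, 0.0]
```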



Auto-Regressive model with extraneous (extra) input (ARX model)

  • Also known as the equation error model, the ARX model is given by

    A(q^-1) y(t) = B(q^-1) u(t) + e(t)



  • This model describes a relation between the input, error and output.

  • Also, the input and output can be described with much freedom compared to the error.

  • This model can be used to describe plants where much freedom is not required for the description of errors.

  • The block diagram representation of ARX model is:
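The ARX structure combines the AR and FIR parts above. The following sketch (names and coefficients are my own, not from the presentation) simulates it; with the noise turned off, the step response settles at the DC gain B(1)/A(1).

```python
import random

def simulate_arx(a, b, u, noise=0.1, seed=1):
    """Simulate an ARX model A(q^-1) y(t) = B(q^-1) u(t) + e(t)."""
    rng = random.Random(seed)
    y = []
    for t in range(len(u)):
        ar = sum(-ak * y[t - 1 - k] for k, ak in enumerate(a) if t - 1 - k >= 0)
        bu = sum(bk * u[t - 1 - k] for k, bk in enumerate(b) if t - 1 - k >= 0)
        y.append(ar + bu + noise * rng.gauss(0.0, 1.0))
    return y

# Noise-free step response of y(t) = 0.8 y(t-1) + u(t-1);
# the steady state is the DC gain B(1)/A(1) = 1 / 0.2 = 5.
y = simulate_arx([-0.8], [1.0], [1.0] * 50, noise=0.0)
```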



Auto-Regressive Moving Average Model (ARMA Model)

  • The ARMA model is described by the equation

    A(q^-1) y(t) = C(q^-1) e(t)



  • This model is a combination of Auto-Regressive (AR) model and Moving Average (MA) model.

  • This model gives a relation between output and error. Here both output and error are described with much freedom.

  • This model is not often used to describe plants, as the input is not considered here.

  • The block diagram representation of ARMA model is:



Auto-Regressive Moving Average Model with extraneous input (ARMAX Model)

  • The ARMAX model is described by the equation

    A(q^-1) y(t) = B(q^-1) u(t) + C(q^-1) e(t)



  • An extension of the ARMA model, in which an extraneous input u(t) is added to the model.

  • Used to describe the plant, as this model describes the input, error and output with full freedom.

  • The block diagram representation of ARMAX model is:



Auto-Regressive Integrated Moving Average Model with extraneous input (ARIMAX Model)

  • The models described above are valid only for white-noise disturbances. To describe disturbances that are variable or drifting in nature, the ARIMAX model is preferred.

  • The equation describing the ARIMAX model is

    A(q^-1) y(t) = B(q^-1) u(t) + C(q^-1) e(t) / Δ,  where Δ = 1 - q^-1 is the differencing operator.



  • In this model, the disturbance is described as the sum of a constant part and a variable part.

  • Hence, this model can be used to describe systems where the disturbance is drifting in nature.

  • The parameter vector to be estimated contains the coefficients of the A, B and C polynomials.




Parametric Estimation Techniques

  • A parametric estimation technique is characterized by a finite-dimensional parameter vector and a mapping from the recorded data to the estimated parameter vector.

  • So, in parametric methods, the result of identification can be expressed by a finite-dimensional parameter vector.

  • Some of the parametric estimation techniques are:

  • Least Squares (LS) Estimation

  • Recursive Least Squares (RLS) Estimation

  • Extended Least Squares (ELS) Estimation and

  • Least Mean Square (LMS) Estimation


Least Squares Estimation

  • Carl Friedrich Gauss formulated the least squares principle, stating that “the unknown parameters of a mathematical model should be chosen in such a way that the sum of squares of the differences between the actually observed and computed values, multiplied by numbers that measure the degree of precision, is a minimum”.



  • The principle is simple to apply to a mathematical model that can be written in the form

    y(i) = φ1(i) θ1 + φ2(i) θ2 + … + φn(i) θn = φ^T(i) θ



  • Such a model is called a regression model.

  • The model is indexed by the variable i, which often denotes time.

  • The variables φk(i) are called the regression variables, or regressors, and are usually a set of inputs.

  • As per the least squares principle, the parameter vector θ should be chosen to minimize the least-squares loss function:

    V(θ, t) = (1/2) Σ_{i=1..t} ( y(i) - φ^T(i) θ )^2



  • Completing the square shows that the first term on the right-hand side is independent of θ and the second term is always non-negative.

  • Hence the minimum is obtained for

    θ̂ = (Φ^T Φ)^-1 Φ^T Y,  provided Φ^T Φ is nonsingular.


Example



  • Statistical Properties of the Least Squares Estimation Technique


Recursive Least Squares (RLS) Estimation

  • In adaptive controllers, the observations are obtained sequentially in real time.

  • To save computation time, the computations can be made recursive in nature.

  • The computation of least square estimate can be arranged in such a way that the results obtained at time t – 1 can be used to get the estimates at time t.
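One recursion of the standard RLS algorithm updates the estimate θ̂ and the matrix P from each new observation. The sketch below (names and data are illustrative) uses the usual gain form K = Pφ / (1 + φ^T P φ).

```python
def rls_step(theta, P, phi, y):
    """One recursive least squares update (no forgetting):
    K = P*phi / (1 + phi^T P phi),  theta <- theta + K*(y - phi^T theta),
    P <- P - K phi^T P."""
    n = len(theta)
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(phi[i] * Pphi[i] for i in range(n))
    K = [v / denom for v in Pphi]
    err = y - sum(phi[i] * theta[i] for i in range(n))
    theta = [theta[i] + K[i] * err for i in range(n)]
    phiTP = [sum(phi[i] * P[i][j] for i in range(n)) for j in range(n)]
    P = [[P[i][j] - K[i] * phiTP[j] for j in range(n)] for i in range(n)]
    return theta, P

# Recover y = 2x + 1 sequentially; a large initial P encodes a weak prior.
theta, P = [0.0, 0.0], [[1e4, 0.0], [0.0, 1e4]]
for x, y in [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]:
    theta, P = rls_step(theta, P, [x, 1.0], y)
# theta ≈ [2.0, 1.0], matching the batch least squares estimate
```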


RLS technique for time-varying parameters

  • In the least squares regression model, the parameters are assumed to be constant, but in practical situations they are time-varying in nature.

  • The least squares method can be extended for the following two cases:

  • The parameters are assumed to change abruptly but infrequently.

  • The parameters are changing continuously but slowly.


Parameter changes abruptly but infrequently:

  • The case of abrupt parameter changes can be covered by resetting.

  • The matrix P in the least squares algorithm is periodically reset to αI, where α is a large number.

  • This implies that the gain K (t) in the estimator becomes large and the estimate can be updated with a larger step.

  • A more sophisticated version is to run n estimators in parallel, which are reset sequentially.

  • The estimate is then chosen by using some decision logic.


Parameters are slowly time-varying in nature:

  • The case of slowly time-varying parameters can be covered by relatively simple mathematical models. The loss function in this case is taken to be

    V(θ, t) = (1/2) Σ_{i=1..t} λ^(t-i) ( y(i) - φ^T(i) θ )^2

    The parameter λ (0 < λ ≤ 1) is called the forgetting factor or discounting factor. The method is therefore called exponential forgetting or exponential discounting.
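In the recursion, forgetting shows up as λ in the gain denominator and a division of P by λ, which keeps the gain from going to zero and lets the estimator track drift. The sketch below (names, λ value and drift scenario are mine) demonstrates this on a scalar parameter that jumps mid-stream.

```python
def rls_forget(theta, P, phi, y, lam=0.98):
    """RLS with exponential forgetting: old data are discounted by lam,
    so the estimator can track slowly drifting parameters."""
    n = len(theta)
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
    K = [v / denom for v in Pphi]
    err = y - sum(phi[i] * theta[i] for i in range(n))
    theta = [theta[i] + K[i] * err for i in range(n)]
    phiTP = [sum(phi[i] * P[i][j] for i in range(n)) for j in range(n)]
    P = [[(P[i][j] - K[i] * phiTP[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P

# Track a scalar parameter that changes from 1 to 3 halfway through.
theta, P = [0.0], [[100.0]]
for y in [1.0] * 50 + [3.0] * 50:
    theta, P = rls_forget(theta, P, [1.0], y, lam=0.9)
# theta follows the change and ends near 3
```

With λ = 1 the update reduces to ordinary RLS, whose gain shrinks over time and would respond to the jump only very slowly.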


Simplified Algorithms (Algorithms that avoid updating the P matrix)

  • The recursive least-squares algorithm has two sets of state variables, θ̂ and P, which must be updated at each step.

  • For large n, the updating of matrix P dominates the computing effort.

  • There are several simplified algorithms that avoid updating the P matrix at the cost of slower convergence.

  • Some algorithms that avoid updating of the P matrix are:

  • Kaczmarz’s Projection algorithm

  • Projection Algorithm (Normalized projection algorithm)

  • Stochastic approximation algorithm and

  • Least mean square algorithm.


Kaczmarz’s Projection algorithm:
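The slide's equation did not survive extraction. As a sketch of the standard form of Kaczmarz's algorithm (function name and example data are mine): each new sample (φ, y) defines a hyperplane y = φ^T θ, and the estimate is orthogonally projected onto it, so no P matrix is needed.

```python
def kaczmarz_step(theta, phi, y):
    """Kaczmarz's projection: theta <- theta + phi*(y - phi^T theta)/(phi^T phi),
    i.e. project the current estimate onto the newest data hyperplane."""
    dot = sum(p * p for p in phi)
    if dot == 0.0:
        return theta  # uninformative sample, leave estimate unchanged
    err = y - sum(p * t for p, t in zip(phi, theta))
    return [t + p * err / dot for p, t in zip(phi, theta)]

# With orthogonal regressors, one pass lands on the exact solution.
theta = [0.0, 0.0]
for phi, y in [([1.0, 0.0], 2.0), ([0.0, 1.0], 3.0)]:
    theta = kaczmarz_step(theta, phi, y)
# theta == [2.0, 3.0]
```

In general the regressors are not orthogonal and the projections converge only gradually, which is the "slower convergence" trade-off noted above.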


Projection algorithm (Normalized Projection Algorithm):


Stochastic approximation algorithm:


Least Mean Square (LMS) algorithm

  • A still simpler algorithm is obtained by eliminating the term P(t) altogether. This is the least mean square (LMS) algorithm, given by

    θ̂(t) = θ̂(t-1) + γ φ(t) ( y(t) - φ^T(t) θ̂(t-1) )

    where γ is a fixed adaptation gain.
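The LMS update above is one line of code; the sketch below (names, gain and data are illustrative) shows it converging on a constant scalar parameter.

```python
def lms_step(theta, phi, y, gamma=0.05):
    """Least mean square update: theta <- theta + gamma*phi*(y - phi^T theta).
    No P matrix is stored, at the cost of slower convergence."""
    err = y - sum(p * t for p, t in zip(phi, theta))
    return [t + gamma * p * err for p, t in zip(phi, theta)]

# Estimate a constant parameter equal to 1 from repeated noiseless samples;
# the error shrinks by the factor (1 - gamma) at each step.
theta = [0.0]
for _ in range(100):
    theta = lms_step(theta, [1.0], 1.0, gamma=0.1)
# theta[0] approaches 1
```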

Limitations of the standard least squares algorithm:

  • It can be directly applied only to systems that can be expressed in terms of the regression model.

  • To apply the least squares principle to any other system, it must first be converted to a regression model.


Extended Least Squares Estimation Algorithm:

