Modeling methods

Used in system identification and in model reference adaptive systems (MRAS).


Linear-in-parameter models (models for describing linear systems)

  • Auto-Regressive Model (AR model)

  • Moving Average Model (MA model)

  • Finite Impulse Response Model (FIR model)

  • Auto-Regressive model with extraneous (extra) input (ARX model)

  • Auto-Regressive Moving Average model (ARMA model)

  • Auto-Regressive Moving Average model with extraneous input (ARMAX model)

  • Auto-Regressive Integrated Moving Average model (ARIMA model)

  • Auto-Regressive Integrated Moving Average model with extraneous input (ARIMAX model)


Contd.

  • Each of the above models describes the relationship between the input, output and error in its own way.

  • Some models have much freedom in describing the input, some have freedom in describing the error, and others have freedom in describing the output.

  • Certain models describe the input, output and error all with freedom.

  • Based on the plant conditions, a particular model can be chosen.

  • The choice of a suitable model for a plant is very important, since the parameters to be estimated depend on the model chosen.

  • Great care therefore needs to be taken in the choice of model for description of the plant.


Auto-Regressive Model (AR model)

  • The Auto-Regressive model is given by the equation

    A(q^-1) y(t) = e(t), i.e. y(t) + a1*y(t - 1) + ... + an*y(t - n) = e(t)

    where y(t) is the output and e(t) is white noise.

  • This model describes a relation only between the output and error.

  • The freedom in describing the output is more than the error.

  • This model is rarely used on its own to describe a plant, since the input is not described here; it is usually used in combination with other models.

  • The block diagram representation of the AR model is: e(t) → [1/A(q^-1)] → y(t)

The parameter vector to be estimated in this model is θ = [a1 a2 ... an]^T.


Moving Average Model (MA model)

  • The Moving Average model is given by the equation

    y(t) = C(q^-1) e(t), i.e. y(t) = e(t) + c1*e(t - 1) + ... + cm*e(t - m)

The parameter vector to be estimated in this model is θ = [c1 c2 ... cm]^T.


Finite Impulse Response Model (FIR model)

  • The Finite Impulse Response model is given by the equation:

    y(t) = B(q^-1) u(t) + e(t), i.e. y(t) = b1*u(t - 1) + ... + bm*u(t - m) + e(t)

  • This model describes a relation between the input, error and output.

  • The input can be described with much freedom compared to the error and output.

  • This model can be used to describe plants where much freedom is not required for the description of errors.

  • The block diagram representation of the FIR model is: u(t) → [B(q^-1)] → (+) ← e(t), with the sum giving y(t)

The parameter vector to be estimated in this model is θ = [b1 b2 ... bm]^T.


Auto-Regressive model with extraneous (extra) input (ARX model)

  • The ARX model, also known as the Equation Error model, is given by:

    A(q^-1) y(t) = B(q^-1) u(t) + e(t)

  • This model describes a relation between the input, error and output.

  • Also, the input and output can be described with much freedom compared to the error.

  • This model can be used to describe plants where much freedom is not required for the description of errors.

  • The block diagram representation of the ARX model is: u(t) → [B(q^-1)/A(q^-1)] → y(t), with e(t) entering through 1/A(q^-1).
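As a concrete sketch of how such a model is fitted in practice (the plant, parameter values, and variable names below are my own illustration, not from the slides), a first-order ARX plant can be written in regression form y(t) = φ(t)^T θ with φ(t) = [-y(t-1), u(t-1)] and θ = [a1, b1], and θ estimated by ordinary least squares:

```python
import numpy as np

# Sketch (assumed example): estimate the parameters of a first-order ARX plant
#   y(t) = -a1*y(t-1) + b1*u(t-1) + e(t)
# by ordinary least squares on its regression form y(t) = phi(t)^T theta.
rng = np.random.default_rng(0)
a1_true, b1_true = -0.8, 0.5           # true plant parameters (assumed values)
N = 500
u = rng.standard_normal(N)             # input sequence
e = 0.01 * rng.standard_normal(N)      # small white-noise disturbance
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a1_true * y[t - 1] + b1_true * u[t - 1] + e[t]

# Regression matrix: row t holds phi(t)^T = [-y(t-1), u(t-1)]
Phi = np.column_stack([-y[:-1], u[:-1]])
theta_hat, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(theta_hat)   # close to [a1_true, b1_true]
```

Note the sign convention: because A(q^-1)y(t) = y(t) + a1*y(t-1) + ..., the delayed outputs enter the regressor with a minus sign.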


Auto-Regressive Moving Average Model (ARMA Model)

  • The model is described by the equation

    A(q^-1) y(t) = C(q^-1) e(t)

  • This model is a combination of the Auto-Regressive (AR) model and the Moving Average (MA) model.

  • This model gives a relation between output and error. Here both output and error are described with much freedom.

  • This model is not often used to describe plants, as the input is not considered here.

  • The block diagram representation of the ARMA model is: e(t) → [C(q^-1)/A(q^-1)] → y(t)




Auto-Regressive Integrated Moving Average Model with eXtraneous input (ARIMAX Model)

  • The models described above are valid only for white-noise disturbances. To describe disturbances which are variable or drifting in nature, the ARIMAX model is preferred.

  • The equation describing the ARIMAX model is:

    A(q^-1) y(t) = B(q^-1) u(t) + C(q^-1) e(t) / Δ, where Δ = 1 - q^-1 is the differencing operator.

  • Note: the integrated disturbance term can be viewed as having a constant part and a variable part.


Parametric Estimation Techniques

  • A parametric estimation technique is characterized by a finite-dimensional parameter vector and a mapping from the recorded data to the estimated parameter vector.

  • So, in parametric methods, the result of identification can be expressed by a finite-dimensional parameter vector in matrix form.

  • Some of the parametric estimation techniques are:

  • Least Squares (LS) Estimation

  • Recursive Least Squares (RLS) Estimation

  • Extended Least Squares (ELS) Estimation and

  • Least Mean Square (LMS) Estimation


Least Squares (LS) Estimation

  • Karl Friedrich Gauss stated the least squares principle: "the unknown parameters of a mathematical model should be chosen in such a way that the sum of squares of the differences between the actually observed and computed values, multiplied by numbers that measure the degree of precision, is minimal".



  • A model of the form y(i) = φ(i)^T θ is called a regression model.

  • The model is indexed by the variable i, which often denotes time.

  • The variables φ(i) are called the regression variables, and usually consist of a set of inputs.

  • As per the least squares principle, the parameter vector θ should be chosen to minimize the least-squares loss function given by:

    V(θ) = (1/2) Σ (y(i) - φ(i)^T θ)², summed over i = 1, ..., t
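A minimal numerical sketch of this principle (the data and parameter values are assumptions for illustration): the minimizer of the least-squares loss solves the normal equations (Φ^T Φ) θ = Φ^T Y, where Φ stacks the regressors row by row.

```python
import numpy as np

# Sketch of the least-squares principle: choose theta minimizing
#   V(theta) = (1/2) * sum_i (y(i) - phi(i)^T theta)^2.
# The minimizer solves the normal equations (Phi^T Phi) theta = Phi^T Y.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((100, 3))        # regression variables (assumed data)
theta_true = np.array([2.0, -1.0, 0.5])    # assumed true parameters
Y = Phi @ theta_true + 0.01 * rng.standard_normal(100)

theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ Y)

def loss(theta):
    """Least-squares loss V(theta) = (1/2) * ||Y - Phi theta||^2."""
    r = Y - Phi @ theta
    return 0.5 * (r @ r)

print(theta_hat)   # close to theta_true
```

In practice `np.linalg.lstsq` is preferred over forming the normal equations explicitly, since it is better conditioned; the explicit form is shown here only because it mirrors the formula.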





Recursive Least Squares (RLS) Estimation

  • In adaptive controllers, the observations are obtained sequentially in real time.

  • To save computation time, the computation can be made recursive in nature.

  • The computation of least square estimate can be arranged in such a way that the results obtained at time t – 1 can be used to get the estimates at time t.
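The standard recursive update (shown here as a sketch; the function name and the simulated data are my own) reuses the estimate and the matrix P from time t - 1 to produce the estimate at time t:

```python
import numpy as np

# Sketch of the standard recursive least-squares (RLS) update:
#   K(t)     = P(t-1) phi(t) / (1 + phi(t)^T P(t-1) phi(t))
#   theta(t) = theta(t-1) + K(t) * (y(t) - phi(t)^T theta(t-1))
#   P(t)     = (I - K(t) phi(t)^T) P(t-1)
def rls_step(theta, P, phi, y):
    K = P @ phi / (1.0 + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = P - np.outer(K, phi @ P)
    return theta, P

rng = np.random.default_rng(2)
theta_true = np.array([1.5, -0.7])   # assumed true parameters
theta = np.zeros(2)
P = 1000.0 * np.eye(2)               # large initial P: uncertain initial estimate
for _ in range(300):
    phi = rng.standard_normal(2)
    y = phi @ theta_true + 0.01 * rng.standard_normal()
    theta, P = rls_step(theta, P, phi, y)
print(theta)   # converges toward theta_true
```

Initializing P to α·I with large α (here 1000) corresponds to the resetting idea discussed later: a large P means large gains K(t) and fast initial adaptation.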


RLS technique for time-varying parameters

  • In the least squares model given above, the parameters are assumed to be constant, but in practical situations they are time-varying in nature.

  • The least squares method can be extended for the following two cases:

  • The parameters are assumed to change abruptly but infrequently.

  • The parameters are changing continuously but slowly.


Parameter changes abruptly but infrequently:

  • The case of abrupt parameter changes can be covered by resetting.

  • The matrix P in the least squares algorithm is periodically reset to αI, where α is a large number.

  • This implies that the gain K (t) in the estimator becomes large and the estimate can be updated with a larger step.

  • A more sophisticated version is to run n estimators in parallel, which are reset sequentially.

  • The estimate is then chosen by using some decision logic.


Parameters are slowly time-varying in nature:

  • The case of slowly time-varying parameters can be covered by relatively simple mathematical models. The loss function in this case is taken to be:

    V(θ, t) = (1/2) Σ λ^(t-i) (y(i) - φ(i)^T θ)², summed over i = 1, ..., t

The parameter λ is called the forgetting factor or discounting factor, with 0 < λ ≤ 1. The method is therefore called exponential forgetting or exponential discounting.
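In recursive form, the forgetting factor enters the RLS gain and the P update (a sketch under assumed values; the drifting plant below is my own illustration), so that old data is discounted and a slowly drifting parameter can be tracked:

```python
import numpy as np

# Sketch of RLS with exponential forgetting (0 < lambda <= 1):
#   K(t)     = P(t-1) phi(t) / (lambda + phi(t)^T P(t-1) phi(t))
#   theta(t) = theta(t-1) + K(t) * (y(t) - phi(t)^T theta(t-1))
#   P(t)     = (P(t-1) - K(t) phi(t)^T P(t-1)) / lambda
def rls_forget_step(theta, P, phi, y, lam=0.97):
    K = P @ phi / (lam + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = (P - np.outer(K, phi @ P)) / lam
    return theta, P

rng = np.random.default_rng(3)
theta = np.zeros(1)
P = 100.0 * np.eye(1)
b = 1.0                              # assumed plant gain, drifting slowly
for t in range(500):
    b += 0.002                       # slow continuous drift
    phi = rng.standard_normal(1)
    y = phi[0] * b + 0.01 * rng.standard_normal()
    theta, P = rls_forget_step(theta, P, phi, y)
print(theta[0], b)   # the estimate tracks the drifted value with a small lag
```

With λ = 1 this reduces to ordinary RLS; smaller λ shortens the effective data memory (roughly 1/(1 - λ) samples), trading noise sensitivity for tracking speed.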


Simplified Algorithms (algorithms that avoid updating of the P matrix)

  • The recursive least-squares algorithm has two sets of state variables, the parameter estimate θ and the matrix P, which must be updated at each step.

  • For large n, the updating of matrix P dominates the computing effort.

  • There are several simplified algorithms that avoid updating the P matrix at the cost of slower convergence.

  • Some algorithms that avoid updating of the P matrix are:

  • Kaczmarz’s Projection algorithm

  • Projection Algorithm (Normalized projection algorithm)

  • Stochastic approximation algorithm and

  • Least mean square algorithm.
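The first of these, Kaczmarz's projection algorithm, can be sketched as follows (a minimal illustration with assumed data; the regularization constant α guards against a zero regressor and is my own addition). Only the parameter vector is stored; no P matrix is updated:

```python
import numpy as np

# Sketch of Kaczmarz's projection algorithm:
#   theta(t) = theta(t-1) + phi(t) * (y(t) - phi(t)^T theta(t-1)) / (phi(t)^T phi(t))
# Each step projects the estimate onto the hyperplane { theta : phi^T theta = y }.
def kaczmarz_step(theta, phi, y, alpha=1e-6):
    err = y - phi @ theta
    return theta + phi * err / (alpha + phi @ phi)   # alpha avoids division by zero

rng = np.random.default_rng(4)
theta_true = np.array([0.9, -0.4])   # assumed true parameters
theta = np.zeros(2)
for _ in range(2000):
    phi = rng.standard_normal(2)
    y = phi @ theta_true             # noise-free measurements for clarity
    theta = kaczmarz_step(theta, phi, y)
print(theta)   # converges to theta_true in the noise-free case
```

The price of dropping P is slower convergence: each step corrects only along the current regressor direction, whereas RLS weights the correction by the accumulated information in P.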





Least Mean Square (LMS) Algorithm

  • An even simpler algorithm is obtained by eliminating the term P(t) altogether. This is the least mean square (LMS) algorithm, given by:

    θ(t) = θ(t-1) + μ φ(t) (y(t) - φ(t)^T θ(t-1)), where μ is a small fixed gain.
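A minimal sketch of the LMS update (the gain value, data, and function name are assumptions for illustration):

```python
import numpy as np

# Sketch of the least-mean-square (LMS) update, which drops P(t) entirely:
#   theta(t) = theta(t-1) + mu * phi(t) * (y(t) - phi(t)^T theta(t-1))
# mu is a small fixed gain (assumed value; too large a gain makes LMS diverge).
def lms_step(theta, phi, y, mu=0.05):
    return theta + mu * phi * (y - phi @ theta)

rng = np.random.default_rng(5)
theta_true = np.array([0.6, 1.2])    # assumed true parameters
theta = np.zeros(2)
for _ in range(3000):
    phi = rng.standard_normal(2)
    y = phi @ theta_true + 0.01 * rng.standard_normal()
    theta = lms_step(theta, phi, y)
print(theta)   # near theta_true, with small residual jitter from the fixed gain
```

Unlike RLS, the fixed gain never decays, so the estimate keeps a small steady-state jitter proportional to μ; this is the cost of the reduced computation per step.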

Limitations of Standard Least Squares Algorithm:

It can be directly applied only to systems that can be expressed in terms of the regression model.

To apply the least squares principle to such a system, it first needs to be converted into a regression model.


