
Model Based Process Monitoring Methods



Presentation Transcript


  1. Model Based Process Monitoring Methods

  2. Process monitoring methods • Model-based (+ includes process knowledge; − needs process models) • Quantitative: residual-based observers, parity-space based • Qualitative: causal models, signed digraphs • Data-based (+ easier to implement; − doesn’t include process knowledge) • Statistical: PCA, PLS • Rule-based (fuzzy), neural networks • Scope of the introductory course: PCA, PLS + GA and some control

  3. Topics • 1. Causal model (qualitative model) based methods • 2. Noise residual detection methods • 3. Analytical FDI methods (quantitative mathematical model) • Observer-based methods • Parameter estimation methods

  4. Analytical methods: Model-based FDI (fault detection and isolation)

  5. Analytical methods for fault detection • Assumption: a mathematical model exists for the process • Generic conceptual structure:

  6. Methods Classification • Parity space (parity equation) approaches • Observer-based approaches • Luenberger observers • Kalman filters • Parameter estimation approaches

  7. Fault detection based on observers • A state observer is used when direct access to the state variables is not possible. If the system is observable, a state observer can be designed to estimate the signals that cannot be measured. There are several types of observers, e.g. Kalman filters and Luenberger observers. • The observer-based approach is appropriate if the faults are associated with actuators, sensors, or unmeasured state variables • A detailed mathematical model is required (preferably first-principles) • A residual is defined between the measured and estimated output (or, in some cases, between the measured and estimated state)

  8. Luenberger observer design • Assume that the process is described by • The state variables are not measured directly • The state and output can be estimated by the observer equations: • H is the so-called observer gain and needs to be designed

  9. Luenberger observer design • The estimation errors are given by • In order to make the estimation errors go to zero asymptotically, the matrix H is designed such that the eigenvalues of (A − HC) lie inside the unit circle (for discrete-time systems). (The MATLAB command place() can be used for this design) • Δy is the residual that forms the basis of an observer-based fault detection and isolation (FDI) system
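As a minimal sketch of this design step, SciPy's `place_poles` can solve the dual pole-placement problem for H; the system matrices, pole locations, and input signal below are made-up illustrations, not taken from the slides:

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical discrete-time process:
#   x(k+1) = A x(k) + B u(k),   y(k) = C x(k)
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Observer design is dual to state feedback: placing the eigenvalues of
# (A - H C) is the same as placing those of (A^T - C^T H^T).
H = place_poles(A.T, C.T, [0.3, 0.4]).gain_matrix.T

# Run observer and process side by side; the residual dy = y - C x_hat
# converges to zero in the fault-free case.
x = np.array([1.0, -0.5])   # true (unmeasured) initial state
x_hat = np.zeros(2)         # observer initial state
u = 0.1                     # constant input, arbitrary
for k in range(50):
    y = C @ x                                   # measured output
    dy = y - C @ x_hat                          # residual
    x_hat = A @ x_hat + B[:, 0] * u + H @ dy    # observer update
    x = A @ x + B[:, 0] * u                     # process update
```

In the fault-free case the residual decays at a rate set by the chosen observer poles; a fault (e.g. a sensor bias added to y) would leave a persistent nonzero residual.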

  10. Luenberger observer block diagram • [Block diagram: the process x(k+1)=Ax(k)+Bu(k), y(k)=Cx(k) runs in parallel with the observer; the residual Δy(k) = y(k) − ŷ(k) is fed back through the observer gain H; q⁻¹ is the delay operator]

  11. Observer Scheme (1) • In order to isolate faults, different observer schemes have been developed, e.g. the dedicated observer scheme (DOS) and the generalized observer scheme (GOS)

  12. Fault signature of DOS

  13. Observer Scheme (2)

  14. Fault signature of GOS

  15. Kalman filter for state estimation • Motivation • The Luenberger observer design places the observer gain H freely, while the Kalman filter designs the observer gain by minimizing the estimation-error covariance • Measurements from the process are usually corrupted by sensor noise and process noise; the Kalman filter is able to estimate the state while accounting for these noise sources

  16. Kalman filter • Assume the process is described as: • where w(k) and v(k) are the process noise and sensor noise respectively. They are assumed to be independent (of each other), white, and normally distributed

  17. Kalman filter • Problem statement: the aim of the Kalman filter is to design the observer gain Kk, with known covariances Q and R, such that the estimation-error covariance is minimized

  18. Kalman filter • The algorithm consists of two steps at each time instant • Step 1: time update (predict). This step predicts the state estimate from the information in the previous state estimate

  19. Kalman filter • Step 2: measurement update (filter). This step uses the information contained in the measurement at the current time instant to correct the prediction from Step 1

  20. Kalman filter
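The two-step predict/update cycle described above can be sketched in Python with NumPy; the system matrices, noise covariances, and simulation are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Hypothetical discrete-time system with process noise w (cov Q) and
# sensor noise v (cov R):
#   x(k+1) = A x(k) + B u(k) + w(k)
#   y(k)   = C x(k) + v(k)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[1e-2]])

def kalman_step(x_hat, P, u, y):
    """One predict/update cycle of the discrete-time Kalman filter."""
    # Step 1: time update (predict) from the previous estimate
    x_pred = A @ x_hat + B @ u
    P_pred = A @ P @ A.T + Q
    # Step 2: measurement update (filter) with the current measurement
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    r = y - C @ x_pred                       # innovation / residual for FDI
    x_new = x_pred + K @ r
    P_new = (np.eye(len(x_hat)) - K @ C) @ P_pred
    return x_new, P_new, r

# Simulate the noisy process and filter it
rng = np.random.default_rng(0)
x = np.array([0.0, 1.0])        # true state (position, velocity)
x_hat = np.zeros(2)
P = np.eye(2)
for k in range(200):
    u = np.zeros(1)
    x = A @ x + B @ u + rng.multivariate_normal(np.zeros(2), Q)
    y = C @ x + rng.normal(0.0, 0.1, size=1)   # noisy measurement
    x_hat, P, r = kalman_step(x_hat, P, u, y)
```

The innovation r plays the same role as the observer residual Δy: in the fault-free case it is zero-mean noise, and a fault shows up as a systematic deviation.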

  21. Kalman filter scheme • The Kalman filter can be applied in both the DOS and GOS schemes for fault detection and isolation.

  22. Fault detection based on parameter estimation • For parameter estimation, the residuals are the differences between the nominal model parameters and the estimated model parameters • Deviations in the model parameters serve as the basis for detecting and isolating faults • The method is appropriate if the process faults are associated with changes in model parameters • The model parameters are not measured but can be estimated using standard parameter estimation techniques • The parameters usually have physical meanings, which can provide very useful information for locating the fault in the process

  23. Steady state parameter estimation • The simplest case is when we identify the model parameters from steady-state data. • Linear case: • Nonlinear case:

  24. Example: Tank with cooling • The tank is modeled as: • If we assume steady state we get: • [Figure: tank with coolant stream at temperature Tcool]

  25. Example: Tank with cooling • Suppose that we are interested in faults related to the parameters G and a • In steady state we can estimate these parameters using the following formulas • If the parameters deviate from their nominal values, a fault has been detected

  26. Example: Tank with cooling • Fault related to the heat exchanger • Fault related to the valve

  27. Steady state parameter estimation • To handle noise better, it is recommended to include several measurement points when calculating the parameter estimate • If the system parameters appear linearly in the equations, the problem can be solved with standard least squares (the pseudo-inverse), where X is the data matrix of inputs and Y is the data matrix of outputs
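A minimal sketch of the pseudo-inverse estimate, with made-up nominal parameters and data:

```python
import numpy as np

# Least-squares parameter estimation from steady-state data:
#   Y = X Θ + noise, solved via the pseudo-inverse X⁺ = (XᵀX)⁻¹Xᵀ
# (the parameter values and data are illustrative, not from the slides)
rng = np.random.default_rng(1)
theta_nominal = np.array([2.0, -0.5])         # nominal model parameters
X = rng.uniform(0.0, 1.0, size=(50, 2))       # 50 steady-state input points
Y = X @ theta_nominal + rng.normal(0.0, 0.01, size=50)

theta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # = pinv(X) @ Y

# The residual for fault detection is the parameter deviation
residual = theta_hat - theta_nominal
```

With many measurement points the noise averages out, so a residual clearly above the noise level indicates a parameter change, i.e. a fault.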

  28. Sliding window steady state parameter estimation • If the process model is linear, as above, the parameters can be re-estimated with least squares over a sliding window of the most recent measurements

  29. Dynamic parameter estimation • Assume the process is described by the difference equation: y(k) + a1*y(k-1) + a2*y(k-2) + … + am*y(k-m) = b0 + b1*x(k-1) + b2*x(k-2) + … + bn*x(k-n) • In regressor form this is y(k) = Ø(k)T*Θ, where Ø(k)T = [y(k-1), y(k-2), …, y(k-m), 1, x(k-1), x(k-2), …, x(k-n)] and Θ = [-a1, -a2, …, -am, b0, b1, …, bn]T (the constant 1 in Ø(k) multiplies the bias term b0)

  30. Dynamic parameter estimation • Similarly, the parameter vector Θ can be estimated with sliding-window least squares. • [Block diagram: delayed y and x samples form the regressor Ø; a sliding window of Ø is collected and Θ is calculated by least squares]
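The scheme above can be sketched for a hypothetical first-order model (m = n = 1): delayed output and input samples form the regressor Ø, and Θ is solved by least squares over the most recent window. The model parameters and data below are made up for illustration:

```python
import numpy as np

# First-order process y(k) = 0.7 y(k-1) + 0.2 + 0.4 x(k-1),
# so Θ = [-a1, b0, b1] = [0.7, 0.2, 0.4]
rng = np.random.default_rng(3)
N, W = 400, 100                        # data length and window width
x = rng.normal(size=N)
y = np.zeros(N)
for k in range(1, N):
    y[k] = 0.7 * y[k-1] + 0.2 + 0.4 * x[k-1]

# Regressors over the last W samples: Ø(k) = [y(k-1), 1, x(k-1)]
ks = np.arange(N - W, N)
Phi = np.column_stack([y[ks - 1], np.ones(W), x[ks - 1]])
Theta_hat, *_ = np.linalg.lstsq(Phi, y[ks], rcond=None)
```

Sliding the window forward and re-solving tracks slow parameter drift; a sudden parameter change appears as a jump in Theta_hat between windows.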

  31. Dynamic parameter estimation • The recursive least squares (RLS) method is another way to estimate the dynamic parameters online. • For the process model y(k) = Ø(k)T*Θ the estimates of the parameters are given as: (λ is the forgetting factor)
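A sketch of the standard RLS recursion with a forgetting factor; the first-order model and parameter values are made-up illustrations:

```python
import numpy as np

# Recursive least squares for y(k) = Ø(k)ᵀΘ with forgetting factor lam
def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step: fold the new sample (phi, y) into the estimate theta."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
    e = y - (phi.T @ theta).item()                   # prediction error
    theta = theta + K.flatten() * e                  # parameter update
    P = (P - K @ phi.T @ P) / lam                    # covariance update
    return theta, P

# Noise-free first-order process y(k) = 0.8 y(k-1) + 0.5 x(k-1),
# so Θ = [-a1, b1] = [0.8, 0.5]
rng = np.random.default_rng(2)
x = rng.normal(size=300)
y = np.zeros(300)
for k in range(1, 300):
    y[k] = 0.8 * y[k-1] + 0.5 * x[k-1]

theta = np.zeros(2)
P = 1000.0 * np.eye(2)     # large P0: little confidence in the initial guess
for k in range(1, 300):
    phi = np.array([y[k-1], x[k-1]])
    theta, P = rls_update(theta, P, phi, y[k])
```

A forgetting factor λ < 1 discounts old samples exponentially, so the estimate can follow time-varying parameters; λ = 1 recovers ordinary recursive least squares.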

  32. Fault detection based on parameter estimation • In the nonlinear case, more advanced optimization methods are needed to determine the parameter values • Linear dynamic cases with noise may also need more advanced parameter estimation methods

  33. Summary of the course • Topics covered: • process monitoring basics • Process monitoring loop • data treatment • Univariate/multivariate statistics • Principal component analysis • mathematics • monitoring indexes • how to use • fault detection and diagnosis • contribution plots • PCA derivation

  34. Summary of the course • Topics covered (cont’d) • PCA limitations • PCA extensions • Nonlinear, dynamic, partial, recursive, multiscale • PLS • mathematically • NIPALS • PLS prediction • Selection of latent variables • fault detection and diagnosis

  35. Summary of the course • Topics covered (cont’d) • neural networks • perceptron, Adaline, Madaline • MLP, Kohonen, recurrent, radial basis • Monitoring with neural networks • dynamic modeling with neural networks (recurrent) • Fundamentals of fuzzy logic • membership functions • inference (use of individual rules) • defuzzification • Mamdani, Sugeno types • Fuzzy logic in monitoring • ANFIS

  36. Summary of the course • Topics covered (cont’d) • Improved control using intelligent methods • Fuzzy tuner • Neural network based control • Genetic algorithms • encoding • fitness • genetic operations: mutation, crossover, selection • Population-based incremental learning

  37. Summary of the course • Topics covered (cont’d) • Causal directed graph methods • CDG with state space model • Residual generation • Residual evaluation (CUSUM, later) • fault location and type rules • Analytical methods • observer based methods • Luenberger observer • Different observer schemes • Kalman filter • parameter estimation based methods • Steady state parameter estimation • Dynamic parameter estimation • CUSUM method

  38. Exam • 31.10, 8-12 o’clock, room ke1/ke2 • The exam will consist of 4-5 questions, some with calculations and some where you are asked to write about and explain some of the topics covered in the course • All the homework needs to be returned to the assistant before the exam. The first 3 assignments have to be accepted before the exam. The credits for the course will be given when all the homework assignments have been accepted
