
Model Quality and Input Design Issues in Prediction Error Identification


Presentation Transcript


  1. Model Quality and Input Design Issues in Prediction Error Identification Håkan Hjalmarsson School of Electrical Engineering KTH Stockholm, Sweden Joint work with Märta Barenthin, Henrik Jansson, Jonas Mårtensson and Bo Wahlberg

  2. or: Flight 765 to Budapest. Håkan Hjalmarsson, in collaboration with Henrik Jansson, Jonas Mårtensson and Märta Barenthin. Department of Signals, Sensors and Systems, Kungliga Tekniska Högskolan (KTH), Stockholm, Sweden

  3. Three Fundamental Issues in Identification
  • Typically, the more complex a system is, the worse the model accuracy will be for a given input: the curse of system complexity.
  • Furthermore, we always worry a bit that our model structure will not capture the true system, giving a bias error that is hard to quantify: the curse of under-modeling.
  • At the same time, we worry that our model structure is too flexible, which gives poor accuracy: the curse of over-modeling.

  4. Model Quality
  Quantity of interest: J(Go). Estimate: J(ĜN). Quality measure: Q.
  A few examples:
  • frequency response
  • impulse response coefficient
  • L2 system gain
  • zeros and poles
  • control design performance

  5. By the way: • Some innocent questions from an (anonymous) novice: • Which quantities are difficult to estimate? • What is the rule of thumb for the model quality of a certain quantity? • What is a good input for a specific quantity?

  6. Example: Frequency Function Estimates
  First approximation (Ljung 1985): Var Ĝ(e^{iω}) ≈ (n/N) · Φv(ω)/Φu(ω).
  Refinement (Ninness, Hjalmarsson, Ljung, Xie): replace the factor n with
  Σk (1 − |ξk|²) / |e^{iω} − ξk|²,
  where the ξk are the poles of the input and the system, and the zeros of the noise model.
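  To make the two approximations concrete, here is a small numerical sketch (Python; the pole locations, model order and sample size are illustrative assumptions, not values from the talk) that evaluates the classical factor n against the pole-dependent refinement:

```python
import numpy as np

# Illustrative poles (assumed, not from the talk): poles of the input and the
# system, and zeros of the noise model, collected as xi_k.
poles = np.array([0.9, 0.5, -0.3])
n = len(poles)                      # the factor in Ljung's approximation
N = 1000                            # sample size
w = np.linspace(0, np.pi, 512)      # frequency grid
z = np.exp(1j * w)

# Ljung (1985): Var G_hat ~ (n/N) * Phi_v/Phi_u -- the factor is simply n.
factor_ljung = n

# Refinement: replace n with sum_k (1 - |xi_k|^2) / |e^{iw} - xi_k|^2.
factor_refined = sum((1 - abs(p) ** 2) / np.abs(z - p) ** 2 for p in poles)

print("Ljung factor (all frequencies):", factor_ljung)
print("Refined factor at w=0 and w=pi:", factor_refined[0], factor_refined[-1])
```

  The refined factor is frequency dependent: it is large near the poles and small elsewhere, whereas the first approximation charges the same n everywhere.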

  7. But What About Other Quantities? • Impulse response coefficient • L2 system gain • Zeros and poles • Control design performance

  8. This Talk • A variance expression for general quantities • A perspective on model quality from an experiment design point of view. • Key observation: • Optimal experiments emphasize system properties of interest, and ”hide” properties of little interest. • Consequences: • Model quality insensitive to system and model complexity. • Experimental cost depends on amount of system information that is required, not only system complexity.

  9. Outline • The inside of the model quality measure Q • Illuminating (?) example • System assumptions • Optimal input design problem • System complexity • Over-modeling • Under-modeling • Final comments

  10. The Inside of Q
  [Block diagram: noise e filtered through Ho gives v; reference r and feedback F generate the input u, which drives the plant Go to the output y; C denotes the controller.]
  Prediction error identification; quality measure Q, general operation and notation.

  11. The Inside of Q. A Taylor approximation of the quality measure Q. But what does the resulting expression mean?

  12. The Inside of Q. An alternative expression. Is this really better?

  13. The Inside of Q
  ∇J(z): the z-transform of the sensitivities of J with respect to the system impulse response.
  Examples:
  • the L2 gain: ∇J(z) = 2G(z)
  • a real NMP zero zk: ∇J(z) ∝ 1/(z − zk^(-1))
  • the k-th impulse response coefficient: ∇J(z) = z^(-k)
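  These sensitivity functions are easy to evaluate numerically. A minimal sketch (the system G and the zero location zk are illustrative assumptions) that computes |∇J(e^{iω})| on a frequency grid for the three examples:

```python
import numpy as np

w = np.linspace(0, np.pi, 512)       # frequency grid
z = np.exp(1j * w)

G = 1 + 0.5 * z ** -1                # illustrative FIR system G(z) (assumed)
grad_L2 = 2 * G                      # L2 gain: grad J(z) = 2 G(z)

zk = 1.25                            # illustrative real NMP zero, |zk| > 1 (assumed)
grad_zero = 1 / (z - 1 / zk)         # NMP zero: grad J(z) proportional to 1/(z - zk^(-1))

k = 3
grad_irc = z ** (-k)                 # k-th impulse response coefficient: grad J(z) = z^(-k)

# |grad J(e^{iw})| shows which frequencies carry information about each quantity.
for name, g in (("L2 gain", grad_L2), ("NMP zero", grad_zero), ("h_k", grad_irc)):
    print(f"{name:9s}: max |grad J| = {np.abs(g).max():.2f}")
```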

  14. The Inside of Q

  15. The Inside of Q
  The space Sn is defined as the linear span of the rows of the predictor gradient. It can be written as
  Sn = span{ H^(-1)(z) T'(z) U(z) }
  (recall T = [G H], and U = the transfer function from r and e to u and e).
  Example: for an FIR model structure in open loop the space is Sn = span{ z^(-1)U(z), …, z^(-n)U(z) }.
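  For the FIR open-loop example, the rows of the predictor gradient are simply delayed input samples. A minimal sketch (white-noise input and the model order are assumptions made for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 1000, 4                       # sample size and FIR model order (assumed)
u = rng.standard_normal(N)           # open-loop input, white for illustration

# Rows of the predictor gradient for an FIR structure: psi(t) = [u(t-1), ..., u(t-n)].
# Their span is the space S_n = span{z^(-1)U(z), ..., z^(-n)U(z)} above.
Psi = np.column_stack([np.roll(u, k)[n:] for k in range(1, n + 1)])

# Sample information matrix; for a unit-variance white input it approaches I.
R = Psi.T @ Psi / len(Psi)
print(np.round(R, 2))
```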

  16. The Inside of Q
  Gives insight into how different quantities affect accuracy:
  • external excitation
  • feedback
  • model structure
  • J itself
  Example: high model order ⇒ Sn = H2 × H2, independent of the model structure.

  17. The Inside of Q
  • Example 1: J = frequency function. The expression reduces to the existing variance formulas.
  • Example 2: J = NMP zero zk, open loop, high model order.

  18. The Inside of Q
  Still think it's too messy? Then forget about the projection: a simple rule of thumb, independent of the model structure! Example: the L2 gain.
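  The slide's formula itself is not preserved in the transcript, so the sketch below assumes one plausible projection-free reading: Var(Ĵ) ≈ (1/N)(1/2π) ∫ |∇J(e^{iω})|² Φv(ω)/Φu(ω) dω, evaluated for the L2-gain example where ∇J = 2G (system and spectra are illustrative assumptions):

```python
import numpy as np

# Assumed projection-free rule of thumb (not verbatim from the slide):
#   Var(J_hat) ~ (1/N) * (1/2pi) * Int |grad J|^2 * Phi_v/Phi_u dw
w = np.linspace(0, np.pi, 4096)
z = np.exp(1j * w)

G = 1 + 0.5 * z ** -1            # illustrative system (assumed)
grad_J = 2 * G                   # L2-gain sensitivity, from slide 13
Phi_v = 0.1                      # white noise spectrum (assumed)
Phi_u = 1.0                      # white input spectrum (assumed)
N = 1000

integrand = np.abs(grad_J) ** 2 * Phi_v / Phi_u
dw = w[1] - w[0]
# The integrand is even in w, so (1/2pi) Int_{-pi}^{pi} = (1/pi) Int_0^{pi}.
var_rot = integrand.sum() * dw / np.pi / N
print(f"rule-of-thumb variance of the L2-gain estimate: {var_rot:.2e}")
```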

  19. And Now Back To: Three Fundamental Issues in Identification
  • The curse of system complexity
  • The curse of under-modeling
  • The curse of over-modeling
  Let's study a simple example.

  20. Example: Static Gain Estimation
  FIR system; objective: estimate the static gain G(0).
  White input: the variance of the static gain estimate grows with the orders no and n, and an under-modeled structure gives a bias.
  What is the optimal input for this problem?

  21. Optimal Experiment Design
  Quantity of interest: J(Go). Estimate: J(ĜN). Quality measure: Q.
  Design problem: minimize the input power subject to Q ≤ γ.
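  A toy instance of this design problem (Python with cvxpy; the values of λo, N and γ are assumptions). For an FIR(2) model and the static-gain quantity J = θ1 + θ2, the estimate covariance is (λo/N)R^(-1) with R the Toeplitz input covariance; since R[1,1]' = (r0+r1)[1,1]', the constraint Var(Ĵ) ≤ γ reduces to r0 + r1 ≥ 2λo/(γN). This reduction is specific to this toy case, not the talk's general formulation:

```python
import cvxpy as cp

lam, N, gamma = 0.1, 1000, 1e-3    # noise variance, sample size, accuracy bound (assumed)

# Input autocorrelations r0 = E[u(t)^2], r1 = E[u(t)u(t-1)].
r0 = cp.Variable()
r1 = cp.Variable()

constraints = [
    r0 >= cp.abs(r1),                   # [r0 r1; r1 r0] is a valid autocorrelation (PSD)
    r0 + r1 >= 2 * lam / (gamma * N),   # Var(J_hat) = (lam/N) * 2/(r0+r1) <= gamma
]
# Design problem: minimize the input power E[u^2] subject to the accuracy bound.
prob = cp.Problem(cp.Minimize(r0), constraints)
prob.solve()
print(f"r0 = {r0.value:.4f}, r1 = {r1.value:.4f}")
```

  At the optimum r1 = r0: the input is perfectly correlated, i.e. the spectrum of a constant signal, in line with the static-gain example on the surrounding slides.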

  22. Input design for static gain estimation
  [Figure: example of a confidence ellipsoid in the (θ1, θ2) plane around the estimate θ̂N for a particular input spectrum; the level lines θ1 + θ2 = constant give the uncertainty interval for Ĝ(0).]

  23. Optimal input design
  [Figure: confidence ellipsoid for an input spectrum chosen so that the ellipsoid is aligned with the level lines θ1 + θ2 = constant; the uncertainty interval for Ĝ(0) is small.]
  • Small uncertainty when the ellipsoid is shaped after the quantity of interest.

  24. Example: Static Gain Estimation
  FIR system; objective: estimate the static gain.
  • Optimal input: constant.
  • System complexity: same accuracy regardless of the true system order!
  • Over-modeling: same accuracy regardless of the model order!
  • Under-modeling: no bias, even for a static model!
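  A Monte Carlo sketch of these claims (the true FIR coefficients, noise level and sample size are illustrative assumptions): with a constant input, the least-squares static-gain estimate is unbiased with the same variance for every model order n, while a white input gives bias for n < no and order-dependent variance:

```python
import numpy as np

rng = np.random.default_rng(1)
theta_o = np.array([1.0, 0.5, -0.2, 0.1])   # true FIR coefficients (assumed), no = 4
no = len(theta_o)
gain_o = theta_o.sum()                      # true static gain
N, lam, MC = 500, 0.1, 1000                 # sample size, noise variance, MC runs (assumed)

def mc_gain(n, constant):
    """Monte Carlo mean/variance of the LS static-gain estimate with an FIR(n) model."""
    ests = np.empty(MC)
    for i in range(MC):
        u = np.ones(N) if constant else rng.standard_normal(N)
        y = np.convolve(u, theta_o)[:N] + np.sqrt(lam) * rng.standard_normal(N)
        # Regressors u(t), u(t-1), ..., u(t-n+1) for t = no-1, ..., N-1.
        Phi = np.column_stack([u[no - 1 - k : N - k] for k in range(n)])
        th = np.linalg.lstsq(Phi, y[no - 1:], rcond=None)[0]  # min-norm LS if rank-deficient
        ests[i] = th.sum()
    return ests.mean(), ests.var()

for n in (1, 2, 4):
    for constant in (True, False):
        m, v = mc_gain(n, constant)
        print(f"n={n}, {'constant' if constant else 'white':8s} input: "
              f"mean={m:.3f} (true {gain_o:.1f}), var={v:.2e}")
```

  With the constant input the regressor matrix has rank one for n > 1; the minimum-norm solution still recovers the static gain exactly while leaving the individual coefficients undetermined, which is the "hidden" effect noted on slide 26.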

  25. Why no bias?
  [Figure: confidence ellipsoids and level curves of the LS criterion in the (θ1, θ2) plane; the reduced-order estimate Ĝlo(0) lies on the level line θ1 + θ2 = constant through θ̂N.]
  • Good low-order models result if the confidence ellipsoid is shaped after the quantity of interest.
  • This is what optimal input design does!
  • Bias and variance issues are not conflicting!

  26. Conclusions from the example
  • The static gain is emphasized in the system output.
  • The individual impulse response coefficients are not identifiable: they are hidden by the optimal experiment.
  • Does this generalize?

  27. System and Model Assumptions
  FIR system of order no; FIR model of order n; estimate θ̂N based on sample size N.

  28. Main Result 1: System Complexity
  Assumption: the true system is in the model class (n = no). Then the optimal input spectrum is given by Φu = (2λo/(γN)) × (a factor determined by ∇J), with cost E u² = 2λo(…)/(γN); the cost depends only on λo and the quantity of interest, not on the system order. The same input is optimal regardless of system complexity, a result obtained from the Q-expression.

  29. Example: NMP-Zero Estimation
  FIR system; objective: estimate the NMP zero zo to within a certain accuracy.
  ∂J/∂θk = zo^(-k), k = 0, 1, …, no ⇒ the same AR(1) input is optimal for estimating a zero at zo, regardless of the system order!
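  A sketch of generating such an input (the AR(1) pole a = 1/zo is an assumption suggested by the sensitivity ∂J/∂θk = zo^(-k); the zero location is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
zo = 1.25                 # assumed NMP zero to be estimated (|zo| > 1)
a = 1 / zo                # AR(1) pole suggested by the sensitivity zo^(-k)
N = 1000

# AR(1) input: u(t) = a*u(t-1) + w(t) with white w. Its spectrum is peaked
# where the sensitivity of the zero estimate is largest.
w = rng.standard_normal(N)
u = np.empty(N)
u[0] = w[0]
for t in range(1, N):
    u[t] = a * u[t - 1] + w[t]

print(f"input power E[u^2] ~ {u.var():.3f} (theory 1/(1-a^2) = {1 / (1 - a**2):.3f})")
```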

  30. Other examples • Impulse response coefficient • L2 gain • Control design performance

  31. Main Result 2: Over-modeling
  If, for n ≥ no, the optimality condition still holds, then the same input spectrum is optimal: no penalty for over-modeling!
  Example: NMP-zero estimation. No penalty for over-modeling if …

  32. Model Quality and Under-modeling
  • The optimal input design we have discussed is based on a full-order model assumption.
  • In reality we always use models of reduced complexity.
  • What estimate do we get when the input design is based on a full-order model but the actual identification uses a simpler model?

  33. Main Result 3: Under-modeling

  34. Main Result 3: Under-modeling
  • Same correlations as before ⇒ no bias error for the optimal input!

  35. Example: NMP-Zero Estimation
  FIR system; objective: estimate the NMP zero zo to within a certain accuracy.
  • Optimal input: the AR(1) input.
  • A second-order model estimates zo consistently, regardless of no!

  36. Cost vs Information Complexity
  • So far the focus has been on a scalar quantity J.
  • System complexity does not influence the experimental cost (as measured by input power).
  • What if more system information is to be estimated?
  • Preliminary results:

  37. Input cost vs system complexity when more information is estimated
  Quantity of interest: the frequency function over a frequency band (width ωB).
  • Input power affine in ωB × no.
  • Both system/model complexity and information complexity are important.
  • The less information, the less the influence of the system.

  38. Summary
  • Optimal experiments emphasize system properties of interest, and ”hide” properties of little interest.
  • Consequences:
  • Model quality insensitive to system and model complexity.
  • Experimental cost depends on the system properties of interest, not only on system complexity.
  • Let sleeping dogs lie! Practiced by practical practitioners.

  39. Current Trends in Experiment Design
  • Plant-friendly inputs
  • MIMO
  • Closed loop
  • The optimal solution often depends on the true system to be estimated:
  • Robust designs. Prior information: θo in some set Θ.
  • Analytic methods
  • Relaxation methods such as Sums-of-Squares

  40. Current Trends in Experiment Design
  • Adaptive approaches
  • Convergence & asymptotic optimality
  • A bright young star is working on this with very nice results.
  • Who? László, of course!

  41. Adaptive Input Design of the L2 Gain
  [Figure: FIR system; variance of the L2-gain estimate vs sample size, comparing László's adaptive scheme with the optimal design (θo known).]

  42. Thank you very much (Köszönöm szépen). Congratulations, László (Gratulálok)!
