
A more reliable reduction algorithm for behavioral model extraction


Presentation Transcript


  1. A more reliable reduction algorithm for behavioral model extraction. Dmitry Vasilyev, Jacob White, Massachusetts Institute of Technology

  2. Outline • Background • Projection framework for model reduction • Balanced Truncation algorithm and approximations • AISIAD algorithm • Description of the proposed algorithm • Modified AISIAD and a low-rank square root algorithm • Efficiency and accuracy • Conclusions

  3. Model reduction problem: replace a model with many (> 10⁴) internal states by one with few (< 100) internal states, keeping the same inputs and outputs. • Reduction should be automatic • Must preserve input-output properties

  4. Differential Equation Model: E dx/dt = Ax + Bu, y = Cx + Du, where x is the state, u is the vector of inputs, and y is the vector of outputs; A is stable, n×n (large), and E is SPD, n×n. • The model can represent: • Finite-difference spatial discretizations of PDEs • Circuits with linear elements

  5. Model reduction problem: reduce the state dimension from n (large, thousands!) to q (small, tens). The reduction needs to be automatic and to preserve input-output properties (the transfer function).

  6. Approximation error • Wide-band applications: the model should have a small worst-case error, i.e. a small maximal difference between the full and reduced transfer functions over all frequencies ω
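As a concrete illustration of this worst-case measure, the sketch below compares a hypothetical 3-state model against a 1-state truncation by sampling the error magnitude on a logarithmic frequency grid (all matrices here are invented for the example; a true H-infinity computation would use a dedicated solver rather than grid sampling):

```python
import numpy as np

# Hypothetical full model H(s) = C (sI - A)^{-1} B and a 1-state reduction.
A  = np.diag([-1.0, -10.0, -100.0]); B  = np.ones((3, 1)); C  = np.ones((1, 3))
Ar = np.array([[-1.0]]);             Br = np.ones((1, 1)); Cr = np.ones((1, 1))

def tf(A, B, C, w):
    """Transfer function value C (jw I - A)^{-1} B at frequency w."""
    n = A.shape[0]
    return (C @ np.linalg.solve(1j * w * np.eye(n) - A, B)).item()

ws = np.logspace(-2, 4, 400)                       # frequency grid
err = [abs(tf(A, B, C, w) - tf(Ar, Br, Cr, w)) for w in ws]
worst_case = max(err)          # grid estimate of the worst-case error
```

Here the retained state matches the slowest pole, so the sampled worst-case error is dominated by the low-frequency contribution of the two discarded poles.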

  7. Projection framework for model reduction • Pick biorthogonal projection matrices W and V (WᵀV = I) • The projection bases are the columns of V (n×q) and W (n×q) • The state is approximated as x ≈ Vxr, and the dynamics are projected: Ax → WᵀAVxr. Most reduction methods are based on projection.
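A minimal numerical sketch of this projection step (random matrices invented purely for illustration): pick V and W, bi-orthogonalize so that WᵀV = I, and form the reduced matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 8, 3
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # toy system matrix
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

V = rng.standard_normal((n, q))
W = rng.standard_normal((n, q))
W = W @ np.linalg.inv(W.T @ V).T     # bi-orthogonalize: now W.T @ V = I

# Reduced q x q model: x ~ V xr, with the residual projected onto span(W).
Ar, Br, Cr = W.T @ A @ V, W.T @ B, C @ V
```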

  8. Projection should preserve important modes. An LTI system maps an input u(t) to an output y(t). P (controllability): which states are easier to reach? Q (observability): which states produce more output? • The reduced model retains the most controllable and most observable modes • A mode must be both very controllable and very observable

  9. Balanced truncation reduction (TBR). Compute the controllability and observability gramians P and Q (O(n³) cost): AP + PAᵀ + BBᵀ = 0 and AᵀQ + QA + CᵀC = 0. The reduced model keeps the dominant eigenspaces of PQ (O(n³) cost): PQvᵢ = λᵢvᵢ, wᵢᵀPQ = λᵢwᵢᵀ. Reduced system: (WᵀAV, WᵀB, CV, D). Very expensive: P and Q are dense even for sparse models.
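The two Lyapunov solves and the eigen-analysis of PQ can be sketched with SciPy's dense solver on a toy diagonal system (for real problems the O(n³) cost and dense P, Q are exactly what makes this impractical):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.diag([-1.0, -2.0, -5.0, -50.0])      # toy stable system
B = np.ones((4, 1)); C = np.ones((1, 4))

# A P + P A^T + B B^T = 0  and  A^T Q + Q A + C^T C = 0
P = solve_continuous_lyapunov(A, -B @ B.T)
Q = solve_continuous_lyapunov(A.T, -C.T @ C)

# TBR keeps the dominant eigenspace of P Q; the square roots of its
# eigenvalues are the Hankel singular values.
lam = np.sort(np.linalg.eigvals(P @ Q).real)[::-1]
hsv = np.sqrt(np.maximum(lam, 0.0))
```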

  10. Most reduction algorithms effectively approximate the dominant eigenspaces of P and Q separately: • Arnoldi [Grimme '97]: colsp(V) = {A⁻¹B, A⁻²B, …}, W = V; approximates P_dom only • Padé via Lanczos [Feldmann and Freund '95]: colsp(V) = {A⁻¹B, A⁻²B, …} approximates P_dom; colsp(W) = {A⁻ᵀCᵀ, (A⁻ᵀ)²Cᵀ, …} approximates Q_dom • Frequency-domain POD [Willcox '02], Poor Man's TBR [Phillips '04]: colsp(V) = {(jω₁I−A)⁻¹B, (jω₂I−A)⁻¹B, …} approximates P_dom; colsp(W) = {(jω₁I−A)⁻ᵀCᵀ, (jω₂I−A)⁻ᵀCᵀ, …} approximates Q_dom. However, what matters is the product PQ.
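As an illustration of the frequency-sampling variants, the sketch below builds a Poor Man's TBR-style basis V by orthogonalizing samples of (jωI − A)⁻¹B (a toy diagonal A and arbitrary sample frequencies, chosen only for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
A = -np.diag(rng.uniform(1.0, 100.0, n))    # toy stable system
B = rng.standard_normal((n, 2))

# Sample (jw I - A)^{-1} B at a few frequencies; stacking real and
# imaginary parts keeps the projection basis real.
cols = []
for w in (0.0, 1.0, 10.0):
    X = np.linalg.solve(1j * w * np.eye(n) - A, B)
    cols += [X.real, X.imag]
V, _ = np.linalg.qr(np.hstack(cols))        # approximates P_dom only
```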

  11. RC line (symmetric circuit). V(t) – input, i(t) – output. • The circuit is symmetric, so P = Q: all controllable states are observable and vice versa.
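The claim P = Q can be checked numerically on an RC-ladder-like surrogate (a symmetric tridiagonal A with collocated input and output, invented for illustration):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

n = 10
# Symmetric tridiagonal A, as produced by a uniform RC ladder.
A = -2.0 * np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
B = np.zeros((n, 1)); B[0, 0] = 1.0
C = B.T                                       # B = C^T: symmetric system

P = solve_continuous_lyapunov(A, -B @ B.T)    # controllability gramian
Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability gramian
```

Because A = Aᵀ and BBᵀ = CᵀC, the two Lyapunov equations coincide and the gramians are identical.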

  12. RLC line (nonsymmetric circuit). The vector of states contains both capacitor voltages and inductor currents. • P and Q are no longer equal! • By keeping only the most controllable and/or the most observable states, we may not find the dominant eigenvectors of PQ.

  13. Lightly damped RLC circuit: R = 0.008, L = 10⁻⁵, C = 10⁻⁶, N = 100. • Exact low-rank approximations of P and Q of order < 50 lead to PQ ≈ 0!

  14. Lightly damped RLC circuit: the top 5 eigenvectors of Q and the top 5 eigenvectors of P (figure). The union of the eigenspaces of P and Q does not necessarily approximate the dominant eigenspace of PQ.

  15. AISIAD model reduction algorithm. Idea of the AISIAD approximation: approximate the dominant eigenvectors by power iterations, Xᵢ = (PQ)Vᵢ, Vᵢ₊₁ = qr(Xᵢ) ("iterate"); Vᵢ converges to the dominant eigenvectors of PQ. But how do we find the product (PQ)Vᵢ?
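The power iteration itself can be sketched on synthetic gramians built with a shared, known eigenbasis R, so the dominant eigenspace of PQ is known exactly (all matrices invented for the example):

```python
import numpy as np

rng = np.random.default_rng(2)
n, q = 20, 3
R, _ = np.linalg.qr(rng.standard_normal((n, n)))   # known eigenbasis
d = 2.0 ** -np.arange(n)                           # decaying eigenvalues
P = R @ np.diag(d) @ R.T
Q = R @ np.diag(d) @ R.T                           # P Q = R diag(d^2) R^T

V, _ = np.linalg.qr(rng.standard_normal((n, q)))
for _ in range(50):                  # X_i = (P Q) V_i,  V_{i+1} = qr(X_i)
    V, _ = np.linalg.qr(P @ (Q @ V))
# V now spans the dominant q-dimensional eigenspace of P Q (span of R[:, :q]).
```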

  16. Approximation of the product: the AISIAD algorithm splits Vᵢ₊₁ = qr(PQVᵢ) into two half-steps, Wᵢ ≈ qr(QVᵢ) and Vᵢ₊₁ ≈ qr(PWᵢ), each approximated using the solution of a Sylvester equation.

  17. More detailed view of the AISIAD approximation (original AISIAD): right-multiplying the Lyapunov equation by Wᵢ turns it into a Sylvester equation with a q×q coefficient matrix H and an n×q right-hand side M.

  18. Modified AISIAD approximation: right-multiply by Vᵢ and approximate the gramian term in the n×q right-hand side M using a low-rank approximation (P̂ or Q̂); H stays q×q as before.

  19. Modified AISIAD approximation (continued): because only a low-rank approximation of the gramian is needed, we can take advantage of the numerous existing methods that approximate P and Q!

  20. Specialized Sylvester equation: AX + XH = −M, where A is n×n, H is q×q, and X and M are n×q. Only the column span of X is needed.

  21. Solving the Sylvester equation: compute the Schur decomposition H = UTUᴴ with T upper triangular; substituting X̃ = XU and M̃ = MU gives AX̃ + X̃T = −M̃, which is solved for the columns of X̃ one at a time.

  22. Solving the Sylvester equation • Applicable to any stable A • Requires q solves with the shifted matrices (A + tⱼⱼI) produced by the Schur decomposition of H • The solves can be accelerated via fast matrix-vector products • Another method exists, based on IRA, but it requires A > 0 [Zhou '02]
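The Schur-based, column-by-column Sylvester solve can be sketched as follows (random stable test matrices; in the actual algorithm each shifted solve would exploit sparsity or fast matrix-vector products rather than a dense factorization):

```python
import numpy as np
from scipy.linalg import schur

def sylvester_colwise(A, H, M):
    """Solve A X + X H + M = 0 (A n x n, H q x q, both stable) by reducing
    H to complex Schur form and back-substituting one column at a time;
    each column costs one solve with the shifted matrix (A + t_jj I)."""
    T, U = schur(H, output='complex')      # H = U T U^H, T upper triangular
    n, q = M.shape[0], H.shape[0]
    Mt = M.astype(complex) @ U             # transformed right-hand side
    Xt = np.zeros((n, q), dtype=complex)
    for j in range(q):
        rhs = -Mt[:, j] - Xt[:, :j] @ T[:j, j]
        Xt[:, j] = np.linalg.solve(A + T[j, j] * np.eye(n), rhs)
    return (Xt @ U.conj().T).real          # real data => real solution

# Quick check on random stable data.
rng = np.random.default_rng(3)
n, q = 12, 3
A = -5.0 * np.eye(n) + rng.standard_normal((n, n))
H = -5.0 * np.eye(q) + rng.standard_normal((q, q))
M = rng.standard_normal((n, q))
X = sylvester_colwise(A, H, M)
residual = np.linalg.norm(A @ X + X @ H + M)
```

Each column requires exactly one shifted solve, so q solves in total, matching the cost statement on this slide.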

  23. Solving the Sylvester equation • For SISO systems with P̂ = 0, the approximation is equivalent to moment matching at the frequency points −Λ(WᵀAW)

  24. Modified AISIAD algorithm
  1. Obtain low-rank approximations P̂ and Q̂ of the gramians P and Q.
  2. Solve AXᵢ + XᵢH + M = 0, so that Xᵢ ≈ PWᵢ, where H = WᵢᵀAᵀWᵢ and M = P̂(I − WᵢWᵢᵀ)AᵀWᵢ + BBᵀWᵢ.
  3. Perform the QR decomposition Xᵢ = VᵢR.
  4. Solve AᵀYᵢ + YᵢF + N = 0, so that Yᵢ ≈ QVᵢ, where F = VᵢᵀAVᵢ and N = Q̂(I − VᵢVᵢᵀ)AVᵢ + CᵀCVᵢ.
  5. Perform the QR decomposition Yᵢ = Wᵢ₊₁R to get the new iterate.
  6. Go to step 2 and iterate.
  7. Bi-orthogonalize W and V and construct the reduced model: (WᵀAV, WᵀB, CV, D).
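The steps above can be sketched compactly in Python, with exact dense gramians standing in for the low-rank approximations P̂, Q̂ and SciPy's general Sylvester solver standing in for the specialized one (all test matrices invented):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_sylvester

rng = np.random.default_rng(4)
n, q = 30, 4
A = -3.0 * np.eye(n) + 0.3 * rng.standard_normal((n, n))  # toy stable A
B = rng.standard_normal((n, 2)); C = rng.standard_normal((2, n))

# Step 1: stand-ins for the low-rank gramian approximations.
Phat = solve_continuous_lyapunov(A, -B @ B.T)
Qhat = solve_continuous_lyapunov(A.T, -C.T @ C)

W, _ = np.linalg.qr(rng.standard_normal((n, q)))
for _ in range(10):
    # Steps 2-3: X_i ~ P W_i from A X + X H + M = 0, then QR.
    H = W.T @ A.T @ W
    M = Phat @ (np.eye(n) - W @ W.T) @ A.T @ W + B @ (B.T @ W)
    V, _ = np.linalg.qr(solve_sylvester(A, H, -M))
    # Steps 4-5: Y_i ~ Q V_i from A^T Y + Y F + N = 0, then QR.
    F = V.T @ A @ V
    N = Qhat @ (np.eye(n) - V @ V.T) @ A @ V + C.T @ (C @ V)
    W, _ = np.linalg.qr(solve_sylvester(A.T, F, -N))

# Step 7: bi-orthogonalize and project.
Wb = W @ np.linalg.inv(W.T @ V).T            # now Wb.T @ V = I
Ar, Br, Cr = Wb.T @ A @ V, Wb.T @ B, C @ V
```

With the exact gramian in place of P̂, the Sylvester step reproduces PWᵢ exactly, which is the quantity the approximation is designed to track.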

  25. For systems in the descriptor form, the gramians satisfy generalized Lyapunov equations, APEᵀ + EPAᵀ + BBᵀ = 0 and AᵀQE + EᵀQA + CᵀC = 0, which lead to similar approximate power iterations.

  26. mAISIAD and low-rank square root: both start from low-rank gramians (cost varies); the LR-square-root projection is the inexpensive step, while the mAISIAD iteration is more expensive. For the majority of non-symmetric cases, mAISIAD works better than the low-rank square root.

  27. RLC line example results: H-infinity norm of the reduction error (worst-case discrepancy over all frequencies). N = 1000, 1 input, 2 outputs.

  28. Steel rail cooling profile benchmark, taken from the Oberwolfach benchmark collection. N = 1357, 7 inputs, 6 outputs.

  29. mAISIAD is useless for symmetric models: for symmetric systems (A = Aᵀ, B = Cᵀ) we have P = Q, so mAISIAD is equivalent to LR-sqrt for P̂, Q̂ of order q (RC line example).

  30. Cost of the algorithm • The cost is directly proportional to the cost of solving a shifted linear system (A + sⱼⱼI)x = b in the non-descriptor case, or (A + sⱼⱼE)x = b in the descriptor case, where sⱼⱼ is a complex number • The cost does not depend on the number of inputs and outputs

  31. Conclusions • The algorithm has superior accuracy and extended applicability compared with the original AISIAD method • It is a very promising low-cost approximation to TBR • It is applicable to any stable dynamical system and will work (though usually worse) even without low-rank gramians • Passivity and stability preservation are possible via post-processing • It is not beneficial if the model is symmetric
