
WIENER FILTERS



  1. WIENER FILTERS ELE 774 - Adaptive Signal Processing

  2. Linear Optimum Filtering: Statement
  • Complex-valued stationary (at least wide-sense stationary) stochastic processes.
  • Linear discrete-time filter with coefficients w0, w1, w2, ... (IIR or FIR; an FIR filter is inherently stable).
  • y(n) is the estimate of the desired response d(n).
  • e(n) is the estimation error, i.e., the difference between the filter output and the desired response.

  3. Linear Optimum Filtering: Statement
  • Problem statement:
  • Given
  • the filter input, u(n),
  • the desired response, d(n),
  • find the optimum filter coefficients, w(n),
  • to make the estimation error “as small as possible”.
  • How?
  • An optimization problem.

  4. Linear Optimum Filtering: Statement
  • Optimization (minimization) criterion:
  • 1. Expectation of the absolute value,
  • 2. Expectation (mean) of the square value,
  • 3. Expectation of higher powers of the absolute value of the estimation error.
  • Minimization of the mean-square value of the error (MSE) is mathematically tractable.
  • The problem becomes: design a linear discrete-time filter whose output y(n) provides an estimate of a desired response d(n), given a set of input samples u(0), u(1), u(2), ..., such that the mean-square value of the estimation error e(n), defined as the difference between the desired response d(n) and the actual response, is minimized.
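The slide's embedded equations are not included in the transcript; in the standard formulation (assuming Haykin's conjugate-weight convention) the quantities above are

    y(n) = \sum_{k=0}^{\infty} w_k^{*}\, u(n-k), \qquad
    e(n) = d(n) - y(n), \qquad
    J = E\!\left[ |e(n)|^{2} \right].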

  5. Principle of Orthogonality
  • The filter output is the convolution of the filter impulse response and the input.

  6. Principle of Orthogonality
  • Error:
  • MSE (mean-square error) criterion:
  • Square → quadratic function → convex function.
  • The minimum is attained when the gradient w.r.t. the optimization variable w is zero.

  7. Derivative in complex variables
  • Let wk be a complex coefficient; then the derivative of J w.r.t. wk is defined through its real and imaginary parts (see the relations below).
  • Hence the gradient can be written directly in terms of the error. !!! J: real, why? !!!
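A reconstruction of the standard complex-gradient step the slide refers to (gradient operator as defined in Haykin):

    w_k = a_k + j b_k, \qquad
    \nabla_k J = \frac{\partial J}{\partial a_k} + j\,\frac{\partial J}{\partial b_k}
               = -2\, E\!\left[ u(n-k)\, e^{*}(n) \right].

J is real because it is the expectation of a squared magnitude, which is why the gradient must be taken with respect to both the real and imaginary parts of each coefficient.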

  8. Principle of Orthogonality
  • The partial derivative of J follows from the definition of the cost function.
  • Using the expressions for e(n) and y(n), the gradient is obtained in terms of the input and the estimation error.

  9. Principle of Orthogonality
  • Setting the gradient to zero gives the orthogonality condition below.
  • The necessary and sufficient condition for the cost function J to attain its minimum value is that the corresponding estimation error eo(n) be orthogonal to each input sample that enters into the estimation of the desired response at time n.
  • The error at the minimum is uncorrelated with the filter input!
  • This is a good basis for testing whether the linear filter is operating in its optimum condition.
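In equation form (a reconstruction of the standard statement), the principle of orthogonality reads

    E\!\left[ u(n-k)\, e_o^{*}(n) \right] = 0, \qquad k = 0, 1, 2, \ldots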

  10. Principle of Orthogonality
  • Corollary: if the filter is operating in its optimum condition (in the MSE sense), the estimate of the desired response defined by the filter output, yo(n), and the corresponding estimation error, eo(n), are orthogonal to each other.

  11. Minimum Mean-Square Error
  • Let the estimate of the desired response that is optimal in the MSE sense, based on the inputs that span the corresponding space, be yo(n).
  • Then the error under optimal conditions is eo(n) = d(n) - yo(n).
  • Also let the minimum MSE be Jmin (≠ 0). HW: try to derive the relation below from the corollary.
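The relation the homework refers to, in its standard form (a reconstruction; all processes are assumed zero mean, and σd², σŷ² denote the variances of d(n) and yo(n)):

    J_{\min} = \sigma_d^{2} - \sigma_{\hat d}^{2}, \qquad
    \sigma_d^{2} = E\!\left[ |d(n)|^{2} \right], \quad
    \sigma_{\hat d}^{2} = E\!\left[ |y_o(n)|^{2} \right].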

  12. Minimum Mean-Square Error
  • Normalized MSE: let ε be defined as below.
  • If ε is zero, the optimum filter operates perfectly, in the sense that there is complete agreement between d(n) and yo(n) (optimum case).
  • If ε is unity, there is no agreement whatsoever between d(n) and yo(n) (worst case).
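The normalized MSE in its standard form (a reconstruction):

    \varepsilon = \frac{J_{\min}}{\sigma_d^{2}}, \qquad 0 \le \varepsilon \le 1.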

  13. Wiener-Hopf Equations
  • We start from the principle of orthogonality.
  • Rearranging it in terms of the autocorrelation of the input and the cross-correlation with the desired response gives the Wiener-Hopf equations (a set of infinitely many equations).

  14. Wiener-Hopf Equations
  • Solution for the linear transversal (FIR) filter case:
  • M simultaneous equations.
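A reconstruction of the Wiener-Hopf equations in the standard notation, where r(k) = E[u(n) u*(n-k)] and p(-k) = E[u(n-k) d*(n)]:

    \sum_{i=0}^{\infty} w_{oi}\, r(i-k) = p(-k), \qquad k = 0, 1, 2, \ldots

For a transversal (FIR) filter of length M the sum runs over i = 0, ..., M-1 and only k = 0, ..., M-1 need to be satisfied, giving the M simultaneous equations mentioned above.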

  15. Wiener-Hopf Equations (Matrix Form)
  • Let R be the correlation matrix of the tap inputs and p the cross-correlation vector between the tap inputs and the desired response; the Wiener-Hopf equations can then be collected into matrix form.

  16. Wiener-Hopf Equations (Matrix Form)
  • Then the Wiener-Hopf equations can be written in matrix form (see below), where wo is composed of the optimum (FIR) filter coefficients.
  • The solution is found by inverting the correlation matrix.
  • Note that R is almost always positive definite.
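In matrix form the equations read R wo = p with R = E[u(n) uᴴ(n)] and p = E[u(n) d*(n)], so wo = R⁻¹p. A minimal numpy sketch of the solution; the statistics below are arbitrary placeholders, not values from the slides:

    import numpy as np

    def wiener_solution(R: np.ndarray, p: np.ndarray) -> np.ndarray:
        """Solve the Wiener-Hopf equations R w_o = p for the optimum FIR taps."""
        # Solving the linear system is preferable to forming R^{-1} explicitly.
        return np.linalg.solve(R, p)

    # Placeholder second-order statistics (hypothetical, for illustration only).
    R = np.array([[1.0, 0.5],
                  [0.5, 1.0]])      # M x M autocorrelation matrix of the tap inputs
    p = np.array([0.6, 0.3])        # M x 1 cross-correlation with the desired response
    print(wiener_solution(R, p))    # optimum tap-weight vector w_o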

  17. Error-Performance Surface
  • Substitute the filter output into the MSE cost function and rewrite J explicitly as a function of the tap-weight vector w.

  18. Error-Performance Surface
  • J(w) is a quadratic function of the filter coefficients, hence a convex function; setting its gradient to zero recovers the Wiener-Hopf equations.
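Written out (a reconstruction of the standard expressions), the error-performance surface and its gradient are

    J(w) = \sigma_d^{2} - w^{H} p - p^{H} w + w^{H} R\, w,
    \qquad
    \nabla J = 2(R w - p) = 0 \;\Rightarrow\; R w_o = p.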

  19. Minimum Value of the Mean-Square Error
  • We calculated the optimum tap-weight vector wo above.
  • The estimate of the desired response is the optimum filter output; hence its variance can be computed, and evaluating J at wo gives Jmin (Jmin is independent of w).
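In the standard notation (a reconstruction), the optimum estimate, its variance, and the minimum MSE are

    \hat d(n) = w_o^{H} u(n), \qquad
    \sigma_{\hat d}^{2} = w_o^{H} R\, w_o = p^{H} R^{-1} p, \qquad
    J_{\min} = \sigma_d^{2} - p^{H} R^{-1} p = \sigma_d^{2} - p^{H} w_o.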

  20. Canonical Form of the Error-Performance Surface
  • Rewrite the cost function in matrix form.
  • Next, express J(w) as a perfect square in w.
  • Then, by substituting the optimum weight vector wo, J(w) can be written in terms of Jmin and the deviation (w - wo), as shown below.
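The perfect-square form referred to above, in its standard shape (a reconstruction):

    J(w) = J_{\min} + (w - w_o)^{H} R\, (w - w_o).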

  21. Canonical Form of the Error-Performance Surface
  • Observations:
  • J(w) is quadratic in w,
  • the minimum is attained at w = wo,
  • J(w) is bounded below by Jmin, which is always a positive quantity (Jmin > 0).

  22. Canonical Form of the Error-Performance Surface
  • Transformations may significantly simplify the analysis.
  • Use the eigendecomposition of R.
  • Define the transformed vector v and substitute back into J to obtain the canonical form below.
  • The elements of the transformed vector v are called the principal axes of the surface.
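A reconstruction of the transformation in the standard notation (Q holds the eigenvectors of R, Λ its eigenvalues):

    R = Q \Lambda Q^{H}, \qquad
    v = Q^{H} (w - w_o), \qquad
    J(v) = J_{\min} + v^{H} \Lambda\, v = J_{\min} + \sum_{k=1}^{M} \lambda_k\, |v_k|^{2}.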

  23. Canonical Form of the Error-Performance Surface
  • [Figure: contours J(w) = c in the (w1, w2) plane centered at wo with J(wo) = Jmin, and the transformed contours J(v) = c along the principal axes v1 (λ1) and v2 (λ2), related by the Q transformation.]

  24. Multiple Linear Regressor Model
  • The Wiener filter tries to match the filter coefficients to the model of the desired response, d(n).
  • The desired response can be generated by
  • 1. a linear model, a,
  • 2. with noisy observable data, d(n),
  • 3. where the noise is additive and white.
  • The model order is m.
  • What should the length of the Wiener filter be to achieve minimum MSE?

  25. Multiple Linear Regressor Model
  • The variance of the desired response follows from the model, and Jmin is determined by wo, the length-M filter optimized w.r.t. the MSE (the Wiener filter).
  • 1. Underfitted model, M < m: performance improves quadratically with increasing M; the worst case is M = 0.
  • 2. Critically fitted model, M = m: wo = a, R = Rm.

  26. Multiple Linear Regressor Model
  • 3. Overfitted model, M > m: a filter longer than the model does not improve performance.
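A reconstruction of the regressor model and the resulting minimum MSE in the standard form (ν(n) is the additive white noise with variance σν², and u_m(n) collects the m model inputs):

    d(n) = a^{H} u_m(n) + \nu(n), \qquad
    \sigma_d^{2} = a^{H} R_m\, a + \sigma_\nu^{2}, \qquad
    J_{\min}(M) = \sigma_d^{2} - p^{H} R^{-1} p,

with J_{\min}(M) decreasing as M grows toward m and J_{\min}(M) = \sigma_\nu^{2} for every M ≥ m, which is the under-/critically-/over-fitted behaviour described on slides 24-26.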

  27. Example
  • We do not know the values of the model; we are given the following second-order statistics.
  • Let
  • the model length of the desired response d(n) be 3,
  • the autocorrelation matrix of the input u(n) (for 3 consecutive samples) be given,
  • the cross-correlation of the input and the (observable) desired response be given,
  • the variance of the observable data (desired response) be given,
  • the variance of the additive white noise be given.

  28. Example
  • Question:
  • a) Find Jmin for a (Wiener) filter length of M = 1, 2, 3, 4.
  • b) Draw the error-performance (cost) surface for M = 2.
  • c) Compute the canonical form of the error-performance surface.
  • Solution:
  • a) Use the expression for Jmin derived earlier and evaluate it for each filter length (a numerical sketch follows below).
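A minimal numpy sketch of part (a), using the result Jmin(M) = σd² − pᴴ R⁻¹ p with R and p truncated to their leading M-by-M block and first M entries. The numbers below are hypothetical placeholders, since the slide's actual values are not in the transcript:

    import numpy as np

    # Hypothetical placeholder statistics (the slide's actual numbers are not in the transcript).
    R = np.array([[1.0, 0.5, 0.25],
                  [0.5, 1.0, 0.5],
                  [0.25, 0.5, 1.0]])   # 3x3 autocorrelation matrix of u(n)
    p = np.array([0.5, 0.4, 0.2])      # cross-correlation E[u(n) d*(n)]
    var_d = 1.0                        # variance of the desired response d(n)

    # J_min(M) = var_d - p_M^H R_M^{-1} p_M, using the leading M x M block of R.
    for M in range(1, 4):
        R_M, p_M = R[:M, :M], p[:M]
        w_o = np.linalg.solve(R_M, p_M)
        J_min = var_d - p_M @ w_o      # real-valued data; use p_M.conj() in the complex case
        print(f"M = {M}: J_min = {J_min:.4f}")

    # For M = 4 (overfitted, M > m = 3) J_min does not decrease further.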

  29. Example
  • Solution, b): [Figure: the error-performance (cost) surface for M = 2.]

  30. Example
  • Solution, c): apply the canonical form with the eigendecomposition of R for M = 2.
  • [Figure: canonical error surface with principal axes v1 (λ1) and v2 (λ2); the minimum value is Jmin.]

  31. Application – Channel Equalization
  • The transmitted signal passes through a dispersive channel, and a corrupted version (distorted by both the channel and additive noise) of x(n) arrives at the receiver.
  • Problem: design a receiver filter so that we can obtain a delayed version of the transmitted signal at its output.
  • Criterion: 1. Zero Forcing (ZF); 2. Minimum Mean Square Error (MMSE).
  • [Block diagram: x(n) → Channel h → (+ noise) → Filter w → y(n); y(n) is compared with the delayed reference x(n-δ) to form the error ε(n).]

  32. Application – Channel Equalization
  • The MMSE cost function is the mean-square error between the filter output and the delayed transmitted signal.
  • The filter output is a convolution of the filter with its input, and the filter input is a convolution of the channel with the transmitted signal (plus noise).

  33. Application – Channel Equalization
  • Combining the last two equations gives a compact form of the filter output; the Toeplitz matrix performs the convolution.
  • The desired signal is x(n-δ).

  34. Application – Channel Equalization
  • Rewrite the MMSE cost function and expand it (the data and the noise are uncorrelated, E{x(n)v(k)} = 0 for all n, k).
  • Re-express the expectations in terms of the known second-order statistics.

  35. Application – Channel Equalization
  • The cost is a quadratic function, so the gradient is zero at the minimum.
  • The solution wo and the corresponding Jmin follow (see the sketch below).
  • Jmin depends on the design parameter δ.
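A minimal numpy sketch of the MMSE equalizer design under the usual assumptions (white transmitted symbols with variance σx², white noise with variance σv², and a Toeplitz channel-convolution matrix H, so that R = σx² H Hᴴ + σv² I, p = σx² H e_δ, wo = R⁻¹ p, and Jmin(δ) = σx² − pᴴ wo). The channel and lengths below are hypothetical placeholders:

    import numpy as np

    # Hypothetical placeholder parameters (not the slides' values).
    h = np.array([1.0, 0.5, 0.2])    # channel impulse response, length L
    M = 5                            # equalizer (FIR) length
    var_x, var_v = 1.0, 0.01         # signal and noise variances

    L = len(h)
    N = M + L - 1                    # span of transmitted samples seen by the equalizer

    # Toeplitz convolution matrix H (M x N): received vector u(n) = H x(n) + v(n).
    H = np.zeros((M, N))
    for i in range(M):
        H[i, i:i + L] = h

    R = var_x * H @ H.T + var_v * np.eye(M)   # correlation matrix of the filter input
    J_min = np.empty(N)
    for delay in range(N):
        p = var_x * H[:, delay]               # cross-correlation with x(n - delay)
        w_o = np.linalg.solve(R, p)           # MMSE equalizer for this delay
        J_min[delay] = var_x - p @ w_o

    best = int(np.argmin(J_min))
    print("best delay:", best, "J_min:", J_min[best])

Sweeping the delay δ and picking the minimizer is the usual way of exploiting the fact, noted above, that Jmin depends on δ.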

  36. Application – Linearly Constrained Minimum-Variance Filter
  • Problem:
  • 1. We want to design an FIR filter which suppresses all frequency components of the filter input except ωo, passing ωo with a gain of g.

  37. Application – Linearly Constrained Minimum-Variance Filter
  • Problem:
  • 2. We want to design a beamformer which can resolve an incident wave coming from angle θo (with a scaling factor g) while suppressing all waves coming from other directions.

  38. Application – Linearly Constrained Minimum-Variance Filter
  • Although these problems are physically different, they are mathematically equivalent.
  • They can be expressed as follows: suppress all components (frequency ω or direction θ) of a signal while keeping the gain of a certain component (ωo or θo) constant.
  • They can be formulated as a constrained optimization problem:
  • cost function: the variance of all components (to be minimized);
  • constraint (equality): the gain of a single component has to be g.
  • Observe that there is no desired response!

  39. Application – Linearly Constrained Minimum-Variance Filter
  • Mathematical model:
  • Filter output | Beamformer output
  • Constraints:

  40. Application – Linearly Constrained Minimum-Variance Filter
  • Cost function: the output power → quadratic → convex.
  • Constraint: linear.
  • The method of Lagrange multipliers can be utilized to solve the problem: form J from the output power and the constraint, and set its gradient to zero.
  • The optimum beamformer weights are found from a set of equations similar to the Wiener-Hopf equations.

  41. Application – Linearly Constrained Minimum-Variance Filter
  • Rewrite the equations in matrix form and solve for wo in terms of the Lagrange multiplier λ.
  • How to find λ? Use the linear constraint.
  • Therefore the solution takes the form given below.
  • For θo, wo is the linearly constrained minimum-variance (LCMV) beamformer.
  • For ωo, wo is the linearly constrained minimum-variance (LCMV) filter.
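A reconstruction of the constrained problem and its solution in the standard form (s(θo) is the steering vector at the constrained direction, or s(ωo) the corresponding frequency vector, and R is the correlation matrix of the input):

    \min_{w}\; w^{H} R\, w
    \quad \text{subject to} \quad w^{H} s(\theta_o) = g,
    \qquad
    w_o = \frac{g^{*}\, R^{-1} s(\theta_o)}{s^{H}(\theta_o)\, R^{-1}\, s(\theta_o)}.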

  42. Minimum-Variance Distortionless Response Beamformer/Filter
  • Distortionless → set g = 1.
  • We can show (HW) that Jmin represents an estimate of the variance of the signal impinging on the antenna array along the direction θo.
  • Generalizing the result to any direction θ (or angular frequency ω) gives the minimum-variance distortionless response (MVDR) spectrum:
  • an estimate of the power of the signal coming from direction θ,
  • an estimate of the power of the signal at frequency ω.
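The standard MVDR relations the slide points to (a reconstruction), with g = 1:

    J_{\min} = \frac{1}{s^{H}(\theta_o)\, R^{-1}\, s(\theta_o)},
    \qquad
    S_{\mathrm{MVDR}}(\theta) = \frac{1}{s^{H}(\theta)\, R^{-1}\, s(\theta)},

and the same expression with s(ω) gives the MVDR spectrum over angular frequency ω.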
