
Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions



Presentation Transcript


  1. Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions Onur G. Guleryuz oguleryuz@erd.epson.com Epson Palo Alto Laboratory Palo Alto, CA (Full screen mode recommended. Please see movies.zip file for some movies, or email me. Audio of the presentation will be uploaded soon.)

  2. Overview
  • Problem definition.
  • Notation and main idea.
  • Difficulties with nonstationary statistics.
  • Algorithm.
  • Properties.
  • Conclusion.
  • Five examples and movies to discuss transform properties.
  • Many more (~20) simulation examples, movies, etc. Please stay after questions.
  Working paper: http://eeweb.poly.edu/~onur/online_pub.html (google: onur guleryuz)

  3. Problem Statement
  Use surrounding spatial information to recover a lost block via adaptive sparse reconstructions.
  Applies to any image prediction scenario, and more generally to any signal prediction scenario.
  Applications: error concealment, damaged images, ...
  Generalizations: irregularly shaped blocks, partial information, ...
  (Figure: image with a lost block.)

  4. Notation: Transforms
  y: signal; G: transform basis; c = Gy: transform coefficients, with c(i) a scalar.
  Assume orthonormal transforms: G^T G = I.

  5. Notation: Approximation
  • Keep K<N coefficients.
  Linear approximation: a priori ordering (keep the first K).
  Nonlinear approximation: signal-dependent ordering (keep the K largest); hence the title "Nonlinear Approximation Based Image ...".
  The significant set: the index set of the K largest coefficients.
  V(x): the index set of the insignificant coefficients (the N-K smallest).
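
The linear vs. nonlinear distinction is easy to demonstrate numerically. A minimal sketch with an orthonormal DCT standing in for G; the test signal and the choice K=8 are illustrative, not from the talk:

```python
import numpy as np
from scipy.fft import dct, idct

N, K = 64, 8

# Illustrative test signal: a low-frequency cosine plus a step.
n = np.arange(N)
x = np.cos(2 * np.pi * 3 * n / N) + 0.5 * (n > N // 2)

c = dct(x, norm="ortho")            # orthonormal transform: c = G x

# Linear approximation: keep the first K coefficients (a priori order).
c_lin = np.zeros(N)
c_lin[:K] = c[:K]
x_lin = idct(c_lin, norm="ortho")

# Nonlinear approximation: keep the K largest-magnitude coefficients
# (signal-dependent order).
keep = np.argsort(np.abs(c))[-K:]   # index set of significant coefficients
c_nl = np.zeros(N)
c_nl[keep] = c[keep]
x_nl = idct(c_nl, norm="ortho")

print("linear    K-term error:", np.linalg.norm(x - x_lin))
print("nonlinear K-term error:", np.linalg.norm(x - x_nl))
```

Because the nonlinear selection keeps the K largest of the same orthonormal coefficients, its K-term error can never exceed the linear one.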

  6. Notation: Sparse
  Sparse classes for linear approximation; sparse classes for nonlinear approximation.

  7. Main Idea
  (1) Original image; (2) lost block; (3) predicted.
  Split pixels into available pixels and lost pixels (assume zero mean).
  • Fix the transform basis G.
  • Given T>0 and the observed signal, recover the lost pixels.

  8. Sparse Classes
  (Figure: the classes shown in pixel coordinates for a "two pixel" image and in transform coordinates, where the constraint is a box of width 2T.)
  Linear approximation: a convex set. Nonlinear approximation, class(K,T): a non-convex, star-shaped set.
  Rolf Schneider, "Convex Bodies: The Brunn-Minkowski Theory," Cambridge University Press, March 2003.
  Onur G. Guleryuz, E. Lutwak, D. Yang, and G. Zhang, "Information-Theoretic Inequalities for Contoured Probability Distributions," IEEE Transactions on Information Theory, vol. 48, no. 8, pp. 2377-2383, August 2002.
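
The star-shaped claim can be checked directly from the class definition. A minimal sketch, under the assumed reading that class(K,T) is the set of signals with at most K coefficients of magnitude at least T (the slide's formal definition is in the paper):

```latex
% Assumed reading of class(K,T): signals with at most K transform
% coefficients of magnitude at least T.
\[
  \mathrm{class}(K,T) \;=\; \bigl\{\, x \;:\; \#\{\, i : |c(i)| \ge T \,\} \le K \,\bigr\},
  \qquad c = G x .
\]
% Star-shaped about the origin: scaling by \lambda in [0,1] only shrinks
% coefficient magnitudes, so no new index becomes significant:
\[
  |\lambda\, c(i)| \;=\; \lambda\, |c(i)| \;\le\; |c(i)|
  \quad\Longrightarrow\quad
  x \in \mathrm{class}(K,T) \;\Rightarrow\; \lambda x \in \mathrm{class}(K,T).
\]
% Not convex: two members whose significant coefficients lie on disjoint
% index sets can sum to a signal with up to 2K significant coefficients.
```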

  9. Examples
  1. Interested in edges, textures, ..., and combinations (not handled well in the literature). Example gains: +9.37 dB, +8.02 dB, +11.10 dB, +3.65 dB.
  2. MSE improving.
  3. Image prediction has a tough audience!

  10. Difficulties with Nonstationary Data
  • Estimation is a well studied topic: infer statistics, then build estimators.
  • With nonstationary data, inferring statistics is very difficult.
  Regions with different statistics: perhaps do edge detection? A higher order method, better edge detection?
  Small amount of data: make a mistake and you learn mixed statistics. Need accurate segmentation (very difficult) just to learn!
  Statistics are not even uniform within regions, and the model order must be very high. Statistics change rapidly, without a priori structure.

  11. Important Properties
  • This technique does not know anything about images.
  • Very robust: no non-robust edge detection, segmentation, training, learning, etc. required.
  • Applicable to general nonstationary signals: use it on speech, audio, seismic data, ...
  • Just pick a transform that provides sparse decompositions under nonlinear approximation; the rest is automated (DCTs, wavelets, complex wavelets, etc.).

  12. Main Algorithm
  G: orthonormal linear transformation; c: linear transform of y (c = Gy).
  • Start with an initial value.
  • Get c.
  • Threshold coefficients to determine V(x,T) (the sparsity constraint).
  • Recover by minimizing (via equations or iterations).
  • Reduce the threshold (the found solution becomes the initial value).
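
The loop on this slide can be sketched in a few lines. A hedged sketch assuming G is an orthonormal 2-D DCT over the whole image and using a simple geometric threshold schedule; the schedule and all parameter values are illustrative choices, not the talk's:

```python
import numpy as np
from scipy.fft import dctn, idctn

def recover(y, mask, T0=1.0, T_min=0.05, decay=0.8, iters_per_T=10):
    """Recover lost pixels by iterated hard-thresholding.
    y    : image with lost pixels filled with an initial value
    mask : True where pixels are available
    """
    x = y.copy()
    T = T0
    while T > T_min:
        for _ in range(iters_per_T):
            c = dctn(x, norm="ortho")      # c = G x
            c[np.abs(c) < T] = 0.0         # V(x,T): zero insignificant coeffs
            x = idctn(c, norm="ortho")     # impose the sparsity constraint
            x[mask] = y[mask]              # re-impose the available pixels
        T *= decay                         # reduce threshold, reuse solution
    return x

# Demo: an image that is exactly 2-sparse in the 2-D DCT, with a lost block.
C = np.zeros((16, 16))
C[0, 2] = C[2, 0] = 4.0
x_true = idctn(C, norm="ortho")
mask = np.ones((16, 16), dtype=bool)
mask[6:10, 6:10] = False                   # 4x4 lost block
y = np.where(mask, x_true, 0.0)            # lost pixels initialized to zero

x_hat = recover(y, mask)
print("initial error  :", np.linalg.norm((y - x_true)[~mask]))
print("recovered error:", np.linalg.norm((x_hat - x_true)[~mask]))
```

Since the demo signal lies exactly in a sparse class of the transform, the iteration recovers the lost block almost perfectly; real images are only approximately sparse, which is where the transform-choice discussion later in the talk matters.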

  13. Progression of Solutions
  (Figure: pixel coordinates for a "two pixel" image; one axis is the available pixel, the other the missing pixel. The available pixel imposes a constraint line; the nonlinear approximation class(K,T) is a non-convex, star-shaped set.)
  Search over class(K,T_0), then over class(K,T_1), ...: as T decreases, the class size increases.

  14. Estimation Theory: Sparsity Constraint = Linear Estimation
  Proposition 1: The solution of the recovery problem subject to a sparsity constraint results in a linear estimate.
  Proposition 2: Conversely, suppose that we start with a linear estimate for the lost pixels; restricted to the corresponding lower-dimensional subspace, it yields a set of sparsity constraints.

  15. Required Statistics? None. The statistics required in the estimation are implicitly determined by the utilized transform and V(x). (V(x) is the index set of insignificant coefficients) I will fix G and adaptively determine V(x). (By hard-thresholding transform coefficients)

  16. A Priori vs. Adaptive
  Method 1 (a priori): optimality? Can at best be ensemble optimal for second order statistics. Does not capture nonstationary signals with edges.
  Method 2 (adaptive): can at best be THE optimal!
  J. P. D'Ales and A. Cohen, "Non-linear Approximation of Random Functions," SIAM J. Appl. Math., vol. 57, no. 2, pp. 518-540, 1997.
  Albert Cohen, Ingrid Daubechies, Onur G. Guleryuz, and Michael T. Orchard, "On the Importance of Combining Wavelet-Based Nonlinear Approximation with Coding Strategies," IEEE Transactions on Information Theory, July 2002.

  17. Conclusion
  • Simple, robust technique.
  • Very good and promising performance.
  • Estimation of statistics not required (though G has to be picked).
  • Applicable to other domains.
  • Q: Over which classes of signals is it optimal? A: The nonlinear approximation classes of the transform.
  • Signal-dependent basis to expand the classes over which it is optimal.
  • Helps design better signal representations (intuitive).

  18. “Periodic” Example +11.10 dB PSNR DCT 9x9 Lower thresholds, larger classes.

  19. Properties of Desired Transforms
  • Localized.
  • Periodic and approximately periodic regions: the transform should "see" the period. Want lots of small coefficients, wherever they may be.
  Example: minimum period 8 requires at least an 8x8 DCT (~ 3-level wavelet packets).
  (Figure: a periodic signal s(n) and its spectrum |S(w)| over [-M, M], with zeroes between the harmonics.)

  20. Periodic Example (period=8) Perfect reconstruction. DCT 8x8 (easy base signal, ~fast decaying envelope).

  21. “Periodic” Example +5.91 dB DCT 24x24 (Harder base signal.)

  22. Edge Example +25.51 dB DCT 8x8 (~ Separable, small DCT coefficients except for first row.)

  23. Edge Example +9.18 dB DCT 24x24 (similar to vertical edge, but tilted)

  24. Properties of Desired Transforms
  • Localized.
  • Periodic, approximately periodic regions: frequency selectivity.
  • Edge regions: the transform should have the frequency selectivity to "see" the slope of the edge.

  25. Overcomplete Transforms
  Use several shifted DCT tilings (DCT2 = DCT1 shifted; DCT3; ...) and combine them.
  A DCT block over an edge is not very sparse; a DCT block over a smooth region is sparse. Only the insignificant coefficients contribute.
  Can be generalized to denoising: Onur G. Guleryuz, "Weighted Overcomplete Denoising," Proc. Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, Nov. 2003.
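
The shifted-tilings idea can be sketched for the denoising generalization mentioned above. A minimal stand-in with uniform averaging of four half-block shifts; the cited paper uses optimized weights, and the block size, threshold, and periodic boundaries here are illustrative assumptions:

```python
import numpy as np
from scipy.fft import dctn, idctn

def denoise_overcomplete(x, L=8, T=0.5):
    """Average hard-thresholded reconstructions from four shifted LxL
    DCT tilings (uniform weights; image sides assumed multiples of L,
    periodic boundaries via np.roll)."""
    H, W = x.shape
    shifts = [(0, 0), (L // 2, 0), (0, L // 2), (L // 2, L // 2)]
    acc = np.zeros_like(x, dtype=float)
    for dy, dx in shifts:
        xs = np.roll(x, (-dy, -dx), axis=(0, 1))    # shift the tiling grid
        out = np.empty_like(xs, dtype=float)
        for i in range(0, H, L):
            for j in range(0, W, L):
                c = dctn(xs[i:i + L, j:j + L], norm="ortho")
                c[np.abs(c) < T] = 0.0              # hard-threshold the block
                out[i:i + L, j:j + L] = idctn(c, norm="ortho")
        acc += np.roll(out, (dy, dx), axis=(0, 1))  # undo the shift
    return acc / len(shifts)

# Demo: a smooth image corrupted by white Gaussian noise.
rng = np.random.default_rng(1)
n = np.arange(32)
clean = np.outer(np.cos(2 * np.pi * n / 32), np.cos(2 * np.pi * n / 32))
noisy = clean + 0.1 * rng.standard_normal((32, 32))
denoised = denoise_overcomplete(noisy, L=8, T=0.3)
print("noisy error   :", np.linalg.norm(noisy - clean))
print("denoised error:", np.linalg.norm(denoised - clean))
```

Averaging the shifted tilings suppresses the blocking artifacts any single tiling would leave at its block boundaries, which is the point of going overcomplete.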

  26. Properties of Desired Transforms
  • Localized: nonlinear approximation does not work for non-localized Fourier transforms!
  • Frequency selectivity for "periodic" and "edge" regions. (Overcomplete DCTs have more mileage since, for a given frequency selectivity, they have ~the smallest spatial support.)
  J. P. D'Ales and A. Cohen, "Non-linear Approximation of Random Functions," SIAM J. Appl. Math., vol. 57, no. 2, pp. 518-540, 1997.

  27. “Periodic” Example +3.65 dB DCT 16x16

  28. “Periodic” Example +7.2 dB DCT 16x16

  29. “Periodic” Example +10.97 dB DCT 24x24

  30. Edge Example +12.22 dB DCT 16x16

  31. “Edge” Example +4.04 dB DCT 24x24

  32. Combination Example +9.26 dB DCT 24x24

  33. Combination Example +8.01 dB DCT 16x16

  34. Combination Example +6.73 dB DCT 24x24 (not enough to “see” the period)

  35. Unsuccessful Recovery Example -1.00 dB DCT 16x16

  36. Partially Successful Recovery Example +4.11 dB DCT 16x16

  37. Combination Example +3.77 dB DCT 24x24

  38. “Periodic” Example +3.22 dB DCT 32x32

  39. Edge Example +14.14 dB DCT 16x16

  40. Edge Example +0.77 dB DCT 24x24

  41. Robustness remains the same but changes.

  42. Determination
  • Start by layering the lost block and estimate one layer at a time (the lost block is potentially large).
  • Recover layer P by using information from layers 0, ..., P-1.
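
The layering step can be made concrete. A small sketch that peels the lost region into layers by repeated one-pixel dilation of the known region; the 8-neighborhood and the periodic-boundary np.roll are implementation choices of this sketch, not the talk's:

```python
import numpy as np

def peel_layers(mask):
    """Label each lost pixel with its layer index: layer 0 touches the
    available region, layer 1 touches layer 0, and so on.  Available
    pixels get -1.  Uses 8-neighborhood dilation; np.roll wraps around
    the borders, which is harmless when the image border is available."""
    known = mask.copy()
    missing = ~mask
    layers = np.full(mask.shape, -1, dtype=int)
    p = 0
    while missing.any():
        grow = known.copy()                 # dilate 'known' by one pixel
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                grow |= np.roll(known, (dy, dx), axis=(0, 1))
        layer = grow & missing              # missing pixels touching 'known'
        layers[layer] = p
        known |= layer
        missing &= ~layer
        p += 1
    return layers

# Demo: a 4x4 lost block inside an 8x8 image peels into two layers.
mask = np.ones((8, 8), dtype=bool)
mask[2:6, 2:6] = False
layers = peel_layers(mask)
print(layers)
```

Recovery would then process layer 0 first, add its estimate to the known region, and continue inward, matching the slide's "recover layer P from layers 0,...,P-1".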

  43. Determination II
  • Fix T. Look at DCTs that have limited spatial overlap with the missing data: in an LxL DCT tiling, hard-threshold the coefficients of the k-th block if its overlap with the missing data is less than L/2 in either direction, i.e. if w_o(k) < L/2 OR u_o(k) < L/2.
  • Establish sparsity constraints by thresholding these DCT coefficients with T. (If |c(i)| < T, add to the sparsity constraints.)
  (Figure: image with the lost block, the outer border of layer 1, and the k-th DCT block with overlaps w_o(k) and u_o(k).)
