
Entropy-constrained overcomplete-based coding of natural images



  1. Entropy-constrained overcomplete-based coding of natural images André F. de Araujo, Maryam Daneshi, Ryan Peng Stanford University

  2. Outline • Motivation • Overcomplete-based coding: overview • Entropy-constrained overcomplete-based coding • Experimental results • Conclusion • Future work

  3. Motivation (1) • Study of new (and unusual) schemes for image compression • Recently, new methods have been developed using the overcomplete approach, but they: • Target restricted compression scenarios • Do not fully exploit this approach’s characteristics for compression

  4. Motivation (2) Why? Sparsity of the coefficients → better overall rate-distortion (RD) performance

  5. Overcomplete coding: overview (1) • K > N implies: • Bases are not linearly independent • Example: • 8x8 blocks: N = 64 basis functions are needed to span the space of all possible signals • An overcomplete basis could have K = 128 • Two main tasks: • Sparse coding • Dictionary learning

  6. Overcomplete coding: overview (2) • Sparse coding (“atom decomposition”) • Compute the representation coefficients x from the given signal y and the given dictionary D • Since D is overcomplete, y = Dx has infinitely many solutions → an approximate (sparse) solution is sought • Commonly used algorithms: Matching Pursuit (MP), Orthogonal Matching Pursuit (OMP)
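
In symbols, the sparse coding task can be stated as follows; this is the standard formulation, with the error target ε and the non-zero-coefficient budget L matching the inputs of the OMP listing on the next slide:

```latex
% Find the sparsest coefficient vector x reproducing y through D (N x K, K > N)
\min_{x \in \mathbb{R}^K} \|x\|_0 \quad \text{s.t.} \quad \|y - Dx\|_2^2 \le \varepsilon
% or, equivalently, with a fixed budget of L non-zero coefficients (NNZ):
\min_{x \in \mathbb{R}^K} \|y - Dx\|_2^2 \quad \text{s.t.} \quad \|x\|_0 \le L
```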

  7. Overcomplete coding: overview (3) Sparse coding (OMP) • Input: dictionary D, signal y, number of non-zero coefficients (NNZ) L (or error target ε) • Output: coefficient vector x • 1. Set r = y (r: residual) • 2. Project r on every basis of D • 3. Select the basis of D with maximum projection; update x and r • 4. Stop if NNZ(x) = L (or ||r||2 < ε). Otherwise, go to 2
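
A minimal NumPy sketch of the OMP loop above. This is illustrative only: the function name and the keyword parameters L and eps are our choices, and, as in standard OMP, the selected coefficients are jointly re-fitted by least squares at every step:

```python
import numpy as np

def omp(D, y, L=None, eps=1e-6):
    """Orthogonal Matching Pursuit: greedy sparse coding of y over dictionary D.
    D: (N, K) dictionary with unit-norm columns; y: (N,) signal.
    Stops after L non-zero coefficients or when ||r||_2 < eps."""
    N, K = D.shape
    x = np.zeros(K)
    r = y.copy()                      # 1. residual starts as the signal itself
    support = []
    while (L is None or len(support) < L) and np.linalg.norm(r) >= eps:
        proj = D.T @ r                # 2. project residual on every basis
        k = np.argmax(np.abs(proj))   # 3. select the atom with maximum projection
        if k in support:
            break                     # no new atom improves the fit
        support.append(k)
        # jointly re-fit all selected coefficients (the "orthogonal" step)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coeffs
        r = y - D @ x                 # 4. update residual, then re-test stopping rule
    return x
```

For instance, `x = omp(D, y, L=8)` reproduces the fixed-NNZ variant, while `omp(D, y, eps=1e-3)` reproduces the error-target variant.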

  8. Overcomplete coding: overview (4) • Dictionary learning • Two basic stages (analogy with K-means) • Sparse coding stage: use a pursuit algorithm to compute x (OMP is usually employed) • Dictionary update stage: adopt a particular strategy for updating the dictionary • Convergence issues: as the first stage does not guarantee the best match, the cost can increase and convergence cannot be assured

  9. Overcomplete coding: overview (5) • Dictionary learning • Most relevant algorithms in the literature: K-SVD and MOD • The sparse coding stage is done in the same way • The codebook update stage is different: • MOD: update the entire dictionary using the optimal adjustment for a given coefficient matrix • K-SVD: update each basis one at a time using an SVD formulation; introduces changes in both the dictionary and the coefficients
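
The MOD update mentioned above has a closed form: with the coefficient matrix X held fixed, the dictionary minimizing ||Y − DX||_F² is the least-squares solution D = Y Xᵀ (X Xᵀ)⁻¹. A hedged NumPy sketch (the pseudo-inverse and the atom re-normalization are our additions for numerical robustness):

```python
import numpy as np

def mod_update(Y, X):
    """MOD dictionary update: optimal D for fixed coefficients X.
    Y: (N, M) training signals, X: (K, M) sparse codes.
    Solves min_D ||Y - D X||_F^2  =>  D = Y X^T (X X^T)^{-1}."""
    D = Y @ X.T @ np.linalg.pinv(X @ X.T)  # pseudo-inverse guards against a singular X X^T
    D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)  # re-normalize atoms to unit norm
    return D
```

K-SVD, by contrast, sweeps over the atoms one at a time, replacing each atom (and its row of coefficients) with the rank-1 SVD approximation of the residual restricted to the signals that use it.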

  10. Entropy-const. OC-based coding (1) • We introduce a compression scheme which employs entropy-constrained stages • RD-OMP • Introduced by Gharavi-Alkhansar (ICIP 1998); uses the Lagrangian cost with a variable number of non-zero (NNZ) coefficients to select basis vectors • EC Dictionary Learning • Introduced in this work; uses a framework inspired by entropy-constrained VQ (ECVQ) to select basis vectors

  11. Entropy-const. OC-based coding (2) • RD-OMP – key ideas • Introduction of the Lagrangian cost J = D + λR • Estimation of the rate cost R (λ is fixed) • Stopping criterion / variable NNZ coefficients • Once no further improvement in the Lagrangian cost is achieved, the algorithm stops
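
Written out, the Lagrangian cost that drives RD-OMP (our reconstruction from this slide: λ is fixed, and R(x) is the estimated rate of coding the selected indexes and quantized coefficients):

```latex
J(x) \;=\; \underbrace{\|y - Dx\|_2^2}_{\text{distortion } D}
\;+\; \lambda \, \underbrace{R(x)}_{\text{estimated rate (indexes + coefficients)}}
```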

  12. Entropy-const. OC-based coding (3) RD-OMP • Input: dictionary D, input signal y • Output: coefficient vector x • 1. For every basis k (from 1 to K): calculate the Lagrangian cost J_k • 2. Pick the coefficient with the smallest J_k; update x and the residual • 3. Stop if the Lagrangian cost no longer improves; otherwise go to 1.
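
A sketch of the RD-OMP selection rule, in the same style as the omp helper above. This is our illustrative reading of the listing: rate_fn is an assumed caller-supplied callback estimating R(x) in bits from the current support and coefficients, and the loop stops as soon as no candidate atom lowers the Lagrangian cost:

```python
import numpy as np

def rd_omp(D, y, lam, rate_fn):
    """RD-OMP: greedy sparse coding minimizing J = ||y - D x||^2 + lam * R(x).
    rate_fn(support, coeffs) -> estimated rate in bits (assumed callback).
    Stops when no candidate atom further reduces the Lagrangian cost."""
    N, K = D.shape
    x = np.zeros(K)
    support = []
    best_J = np.linalg.norm(y) ** 2 + lam * rate_fn([], np.array([]))
    while True:
        best = None
        for k in range(K):            # 1. evaluate the Lagrangian cost for every basis k
            if k in support:
                continue
            trial = support + [k]
            coeffs, *_ = np.linalg.lstsq(D[:, trial], y, rcond=None)
            J = (np.linalg.norm(y - D[:, trial] @ coeffs) ** 2
                 + lam * rate_fn(trial, coeffs))
            if best is None or J < best[0]:
                best = (J, k, coeffs) # 2. remember the smallest-cost candidate
        if best is None or best[0] >= best_J:
            break                     # 3. stop: the Lagrangian cost no longer improves
        best_J, k, coeffs = best
        support.append(k)
        x[:] = 0.0
        x[support] = coeffs
    return x
```

Note how the NNZ count is variable by construction: atoms are added only while each addition pays for its rate in reduced distortion.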

  13. Entropy-const. OC-based coding (4) • EC Dictionary Learning – key ideas • Dictionary update strategy • K-SVD modifies dictionary and coefficients – a reduction in the Lagrangian cost is not assured • We use MOD, which provides the optimal adjustment assuming fixed coefficients • Introduction of a “Rate cost update” stage • Analogous to the ECVQ algorithm for training data • Two pmfs must be updated: indexes and coefficients

  14. Entropy-const. OC-based coding (5) EC-Dictionary Learning • Input: input signal y • Output: dictionary D • 1. Initialize the dictionary D • 2. Sparse coding stage: RD-OMP → find the coefficient vector x • 3. Rate cost update stage: update the pmfs (indexes and coefficients); codeword length update: l = −log2 p • 4. Dictionary update stage: MOD dictionary update • 5. Stop when the Lagrangian cost converges; otherwise go to 2
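
Putting the stages together, a schematic of the training loop built on the rd_omp and mod_update sketches above. This is our sketch under the slide's ECVQ analogy: the index pmf is estimated by counting atom usage, codeword lengths follow the entropy rule l = −log2 p, convergence is declared when the total Lagrangian cost stops decreasing, and for brevity the coefficient pmf (which the slide also updates) is omitted:

```python
import numpy as np

def ec_dictionary_learning(Y, D0, lam, n_iter=50, tol=1e-4):
    """Entropy-constrained dictionary learning (sketch).
    Y: (N, M) training signals; D0: (N, K) initial dictionary."""
    D = D0.copy()
    index_bits = np.zeros(D.shape[1])          # codeword lengths for atom indexes
    prev_cost = np.inf
    for _ in range(n_iter):
        # 2. sparse coding stage: RD-OMP on every training signal
        rate_fn = lambda s, c: sum(index_bits[k] for k in s)  # index rate only
        X = np.column_stack([rd_omp(D, Y[:, m], lam, rate_fn)
                             for m in range(Y.shape[1])])
        # 3. rate cost update stage: re-estimate the index pmf, then l_k = -log2 p_k
        counts = (X != 0).sum(axis=1) + 1e-9    # small floor avoids log(0)
        p = counts / counts.sum()
        index_bits = -np.log2(p)
        # 4. dictionary update stage: MOD closed-form update
        D = mod_update(Y, X)
        # 5. stop when the Lagrangian cost converges
        cost = (np.linalg.norm(Y - D @ X) ** 2
                + lam * index_bits @ (X != 0).sum(axis=1))
        if prev_cost - cost < tol:
            break
        prev_cost = cost
    return D
```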

  15. Experiments (Setup) • Rate calculation: optimal codebook (entropy) for each subband • Test images: Lena, Boats, Harbour, Peppers • Dictionary training experiments • Training data: 18 Kodak images downsampled to 128x128 (not including the images being coded) • Downsampling to 128x128 was needed due to the very high computational complexity (for the other experiments, higher resolutions were employed: 512x512, 256x256)

  16. Experiments (Sparse Coding) • Comparison of Sparse coding methods

  17. Experiments (Dict. learning) • Comparison of dictionary learning methods

  18. Experiments (Compression schemes) (1) • Scheme 1: training and coding on the same image (the dictionary is sent) • Scheme 2: training on a set of natural images and applying the dictionary to other images

  19. Experiments (Compression schemes) (2)

  20. Experiments (Compression schemes) (3)

  21. Conclusion • Improvement of sparse coding: • RD-OMP • Improvement of dictionary learning: • Entropy-constrained overcomplete dictionary learning • Better overall performance compared to standard techniques

  22. Future work • Extension of implementation to higher resolution images • Further investigation of trade-off between K and N • Evaluation against directional transforms • Low complexity implementation of the algorithms
