(1) A probability model respecting those covariance observations: Gaussian

- Maximum entropy probability distribution for a given covariance observation (shown zero mean for notational convenience), written in terms of the vector of image pixels and the inverse covariance matrix.
- If we rotate coordinates to the Fourier basis, the covariance matrix in that basis will be diagonal. So in that model, each Fourier transform coefficient is an independent Gaussian random variable whose variance is the power spectrum at that frequency.
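The density the first bullet refers to is the standard zero-mean Gaussian maximum-entropy result; it is reconstructed here from the stated assumptions, since the slide's own formula did not survive the transcript:

```latex
P(\mathbf{x}) \;=\; \frac{1}{Z}\,\exp\!\Big(-\tfrac{1}{2}\,\mathbf{x}^{\mathsf{T}} C^{-1}\,\mathbf{x}\Big)
```

where the vector x holds the image pixels, C is the observed covariance matrix (so C⁻¹ is the "inverse covariance matrix" of the slide's annotation), and Z is the normalizing constant.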

Power spectra of typical images

Experimentally, the power spectrum as a function of Fourier frequency is observed to follow a power law.
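A common parameterization of that power law (the exponent is empirical and varies from image to image, with values near 2 being typical; this is a hedged sketch, not the slide's exact formula):

```latex
S(f) \;=\; \frac{A}{f^{\gamma}}, \qquad \gamma \approx 2
```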

http://www.cns.nyu.edu/pub/eero/simoncelli05a-preprint.pdf

Random draw from Gaussian spectral model


Noise removal (in frequency domain), under Gaussian assumption

Quantities annotated in the slide's posterior formula:

- Observed Fourier component
- Estimated Fourier component, X
- Power-law prior probability on the estimated Fourier component
- Posterior probability for X
- Variance of the white, Gaussian additive noise

Setting to zero the derivative of the log probability of X gives an analytic form for the optimal estimate of X (or just complete the square):
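Carrying out that completion of the square, for a Gaussian likelihood with noise variance σ² and a Gaussian prior with spectral power S(f), gives the classical Wiener estimate (reconstructed from the stated assumptions, not copied from the slide):

```latex
\hat{X}(f) \;=\; \frac{S(f)}{S(f) + \sigma^{2}}\; Y(f)
```

Each observed Fourier component Y(f) is shrunk toward zero by a factor that compares the prior signal power at that frequency with the noise power.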

Noise removal, under Gaussian assumption

(1) Denoised with Gaussian model, PSNR=27.87

With Gaussian noise of std. dev. 21.4 added, giving PSNR=22.06

original

(try to ignore JPEG compression artifacts from the PDF file)

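The frequency-domain denoiser can be sketched in a few lines of NumPy. Everything here is illustrative (a synthetic 1/f² random field, an assumed noise level); it is not the experiment behind the PSNR numbers quoted on the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128

# Synthesize an image-like signal: a Gaussian random field with an
# (assumed) 1/f^2 power-law spectrum.
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
f = np.hypot(fx, fy)
f[0, 0] = 1.0 / n                       # avoid division by zero at DC
S = 1.0 / f**2                          # power-law prior spectrum

img = np.real(np.fft.ifft2(np.sqrt(S) * np.fft.fft2(rng.standard_normal((n, n)))))

# Add white Gaussian noise (illustrative level: half the image std. dev.).
sigma = 0.5 * img.std()
noisy = img + sigma * rng.standard_normal((n, n))

# MAP-under-Gaussian (Wiener) estimate per frequency: shrink each Fourier
# coefficient by S / (S + sigma^2).  With this construction the FFT-domain
# signal and noise variances share the same pixel-count factor, so it cancels.
W = S / (S + sigma**2)
denoised = np.real(np.fft.ifft2(W * np.fft.fft2(noisy)))

mse = lambda a, b: float(np.mean((a - b) ** 2))
print("noisy MSE   :", mse(noisy, img))
print("denoised MSE:", mse(denoised, img))
```

Because the prior spectrum here matches the one the signal was drawn from, the denoised MSE comes out below the noisy MSE; with a mismatched spectrum the gain shrinks.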

(2) The wavelet marginal model

Histogram of wavelet coefficients, c, for various images.


Quantities annotated in the slide's marginal density:

- Wavelet coefficient value
- Parameter determining the width of the distribution
- Parameter determining the peakiness of the distribution
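The marginal density these quantities parameterize is usually written as a generalized Laplacian in Simoncelli's papers; reconstructed here, since the slide's own formula is not legible in the transcript:

```latex
P(c) \;=\; \frac{1}{Z}\,\exp\!\left(-\left|\frac{c}{s}\right|^{p}\right)
```

where c is the wavelet coefficient value, s sets the width of the distribution, and p sets its peakiness (p = 2 recovers a Gaussian; natural-image fits typically give p < 1).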

Random draw from the wavelet marginal model


Image statistics (or, mathematically, how can you tell an image from noise?)

Noisy image

By definition of conditional probability, P(x, y) = P(x|y) P(y).

Annotations on the terms of P(x|y) = P(y|x) P(x) / P(y):

- x: the parameters you want to estimate
- y: what you observe
- P(y|x): the likelihood function
- P(x): the prior probability
- P(y): constant w.r.t. the parameters x

Bayes theorem

P(x, y) = P(x|y) P(y) and, by symmetry, P(x, y) = P(y|x) P(x)

so

P(x|y) P(y) = P(y|x) P(x)

and

P(x|y) = P(y|x) P(x) / P(y)

Using that twice

[Figure: prior P(x), likelihood P(y|x), and posterior P(x|y) plotted against the coefficient value y]

Bayesian MAP estimator for clean bandpass coefficient values

Let x = bandpassed image value before adding noise.

Let y = noise-corrupted observation.

By Bayes theorem

y = 25

P(x|y) = k P(y|x) P(x)


Bayesian MAP estimator

Let x = bandpassed image value before adding noise.

Let y = noise-corrupted observation.

By Bayes theorem

y = 50

P(x|y) = k P(y|x) P(x)

[Figure: likelihood P(y|x) and posterior P(x|y) for the observation y = 50]

Bayesian MAP estimator

Let x = bandpassed image value before adding noise.

Let y = noise-corrupted observation.

By Bayes theorem

y = 115

P(x|y) = k P(y|x) P(x)

[Figure: prior P(x), likelihoods P(y|x), and posteriors P(x|y) compared for observations y = 25 and y = 115]

For small y: it is probably due to noise, and the estimate should be set to 0.

For large y: it is probably due to an image edge, and the estimate should be kept untouched.

MAP estimate, x̂, as a function of the observed coefficient value, y

Simoncelli and Adelson, Noise Removal via Bayesian Wavelet Coring

http://www-bcs.mit.edu/people/adelson/pub_pdfs/simoncelli_noise.pdf
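The coring behavior described above can be reproduced numerically: maximize log P(y|x) + log P(x) on a grid, with a Gaussian likelihood and a generalized-Laplacian prior. The parameter values (sigma, s, p) are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# Numerical MAP estimate x_hat(y) = argmax_x [ log P(y|x) + log P(x) ],
# with Gaussian likelihood of std. dev. sigma and generalized-Laplacian
# prior P(x) ∝ exp(-|x/s|^p).  Parameter values are illustrative.
sigma, s, p = 10.0, 10.0, 0.5
xs = np.linspace(-200.0, 200.0, 8001)   # search grid, step 0.05

def map_estimate(y):
    log_posterior = -(y - xs) ** 2 / (2 * sigma**2) - np.abs(xs / s) ** p
    return xs[np.argmax(log_posterior)]

# Small observations are "cored" toward 0; large ones are kept nearly intact.
for y in (5.0, 25.0, 115.0):
    print(f"y = {y:6.1f}  ->  MAP estimate = {map_estimate(y):7.2f}")
```

With these settings the estimator suppresses small observed coefficients (likely noise) while leaving large ones (likely edges) almost unchanged, matching the nonlinearity plotted on the slide.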

(1) Denoised with Gaussian model, PSNR=27.87

With Gaussian noise of std. dev. 21.4 added, giving PSNR=22.06

original

(2) Denoised with wavelet marginal model, PSNR=29.24


M. F. Tappen, B. C. Russell, and W. T. Freeman, "Efficient graphical models for processing images," IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), Washington, DC, 2004.

Motivation for wavelet joint models

Note correlations between the amplitudes of each wavelet subband.


Statistics of pairs of wavelet coefficients

Contour plots of the joint histogram of various wavelet coefficient pairs

Conditional distributions of the corresponding wavelet pairs


Gaussian scale mixture model simulation

[Figure panels: model simulation vs. observed wavelet statistics]

(3) Gaussian scale mixtures

Wavelet coefficient probability:

A mixture of Gaussians of scaled covariances

z is a spatially varying hidden variable that can be used to

(a) create the non-Gaussian histograms from a mixture of Gaussian densities, and (b) model correlations between neighboring wavelet coefficients.
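In symbols, the Gaussian scale mixture writes a coefficient neighborhood x as (the standard form from the GSM literature, reconstructed rather than copied from the slide):

```latex
\mathbf{x} \;=\; \sqrt{z}\,\mathbf{u}, \qquad \mathbf{u}\sim\mathcal{N}(\mathbf{0},\,Q), \qquad
P(\mathbf{x}) \;=\; \int \mathcal{N}\!\left(\mathbf{x};\,\mathbf{0},\, z\,Q\right) p(z)\,dz
```

where z ≥ 0 is the hidden multiplier and Q a fixed covariance; mixing over z produces both the heavy-tailed marginals and the amplitude correlations noted above.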

(1) Denoised with Gaussian model, PSNR=27.87

(2) Denoised with wavelet marginal model, PSNR=29.24

With Gaussian noise of std. dev. 21.4 added, giving PSNR=22.06

original

(3) Denoised with Gaussian scale mixture model, PSNR=30.86

