
Presentation Transcript


  1. Agenda • Review of Proj 1 • Concepts for Proj 2 • Probability • Morphology • Image Blending

  2. Proj 1 • Grades in EEE Dropbox • Out of 40 points; followed the breakdown on the project webpage • Solution is also in the dropbox • I strongly suggest you look at it • I still learn ‘tricks of the trade’ by looking at other people’s code

  3. Demosaicing Look at code

  4. whiteBalance.m, clipHistogram.m, gammaCorrectOrig.m, gammaCorrect.m. Someone in class had a better implementation!

  5. Agenda • Review of Proj 1 • Concepts for Proj 2 • Probability • Morphology • Image Blending

  6. Discrete Probability • We have a biased 6-sided die. How do we encode the chances of possible rolls?

  7. Discrete Probability • We have a biased 6-sided die. How do we encode the chances of possible rolls? • Table of 6 #’s that add to 1 • Write formally as P(R=1)=.1, P(R=2)=.2, … • Here, R is a discrete random variable • The function P(.) is sometimes called a probability mass function (pmf) • In the continuous world, it’s called a pdf (probability density function)

  8. Discrete Probability • Assume we have a rand() function that can generate a random # in [0,1] • Given the pmf, how can we now generate sample rolls according to P(R=1)=.1, P(R=2)=.2, …?

  9. Discrete Probability • Given the pmf, how can we now generate sample rolls according to P(R=1)=.1, P(R=2)=.2, …? • Idea: stack line segments of length .1, .2, … • Total height of stack = 1 • Randomly pick a height & see which segment we hit • In Matlab: sample = find(cumsum(pmf) >= rand, 1)
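The stacked-segments idea (the inverse-CDF method) translates directly outside Matlab as well. A minimal Python/NumPy sketch; the pmf values here are made up for illustration:

```python
import numpy as np

# Hypothetical biased pmf for the six faces (entries sum to 1).
pmf = np.array([0.1, 0.2, 0.1, 0.2, 0.1, 0.3])

def sample_roll(pmf, rng=np.random.default_rng()):
    """Draw a face in 1..6 from pmf: pick a uniform height in [0, 1)
    and see which segment of the cumulative sum it lands in."""
    u = rng.random()
    # First index where the cumulative sum reaches u
    # (same as Matlab's find(cumsum(pmf) >= rand, 1)).
    idx = int(np.searchsorted(np.cumsum(pmf), u))
    return min(idx, len(pmf) - 1) + 1   # guard against round-off at 1.0

roll = sample_roll(pmf)
```

Over many samples, the empirical frequencies approach the pmf entries.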

  10. Image histogram • Consider a grayscale image with intensities in {0, …, 255} • Construct a table of 256 values. Normalize so that the entries add to 1 • We can interpret the histogram as a pmf • P(I=0)=.0006, P(I=1)=.003, … • What would a ‘sample’ image look like? Not so good, so let’s come up with a stronger model…
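A sketch of the histogram-as-pmf idea, with a tiny made-up image standing in for a real one. Sampling each pixel independently from the pmf discards all spatial structure, which is exactly why the slide calls for a stronger model:

```python
import numpy as np

# A small synthetic grayscale image (stand-in for a real one).
img = np.array([[0, 0, 1], [1, 2, 255]], dtype=np.uint8)

# 256-bin histogram normalized to sum to 1: an empirical pmf over intensity.
hist = np.bincount(img.ravel(), minlength=256).astype(float)
pmf = hist / hist.sum()

# "Sample" image: each pixel drawn i.i.d. from the intensity pmf.
rng = np.random.default_rng(0)
sample = rng.choice(256, size=img.shape, p=pmf).astype(np.uint8)
```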

  11. Joint probability tables • Let’s say we have two “loaded” dice • Define a joint pmf by a table of 6x6 #’s that add to 1 • P(R1,R2) = (a 6x6 table)

  12. Joint probability tables • P(R1,R2) = (6x6 table; rows indexed by R1, columns by R2) • What’s P(R1=k) for k = 1..6? What’s P(R2=k) for k = 1..6? • The marginals P(R1), P(R2) look fair, but the joint P(R1,R2) is clearly biased
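Summing out the other variable of the joint table gives each marginal. A NumPy sketch with a hypothetical joint that puts extra mass on doubles (so it is clearly biased) while both marginals come out fair:

```python
import numpy as np

# Hypothetical joint pmf P(R1, R2): rows index R1, columns index R2.
joint = np.full((6, 6), 1 / 60.0)
np.fill_diagonal(joint, 1 / 12.0)   # extra mass on doubles

# Marginals: sum out the other variable.
p_r1 = joint.sum(axis=1)   # P(R1 = k), k = 1..6
p_r2 = joint.sum(axis=0)   # P(R2 = k), k = 1..6
```

Both marginals are uniform (1/6 each), yet the joint is far from the product of the marginals: the dice are dependent.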

  13. Conditional Probability Tables (CPT) • P(R1|R2) = P(R1,R2)/P(R2) (6x6 table; rows indexed by R1, columns by R2) • What’s P(R1=x|R2=y) for x = 1..6, y = 1..6? • Does the CPT contain the same information as the joint? What if we knew both the P(R1|R2) and P(R2) tables?
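One way to see the answer to the slide's question: the CPT alone loses the marginal P(R2), but the CPT together with P(R2) recovers the joint exactly. A sketch, reusing a hypothetical joint table:

```python
import numpy as np

# Hypothetical joint pmf (rows index R1, columns index R2).
joint = np.full((6, 6), 1 / 60.0)
np.fill_diagonal(joint, 1 / 12.0)

p_r2 = joint.sum(axis=0)              # marginal P(R2)
cpt = joint / p_r2[np.newaxis, :]     # P(R1|R2): each column sums to 1

# Joint is recoverable from CPT + marginal: P(R1,R2) = P(R1|R2) P(R2).
reconstructed = cpt * p_r2[np.newaxis, :]
```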

  14. Shannon’s language model • P(word | word_prev) = (table indexed by word and word_prev) • What would be reasonable CPT values?

  15. N-th order Markov models • P(word | history) = P(word | word_prev1, word_prev2) (table indexed by word and the pair word_prev1, word_prev2) • What would be reasonable CPT values?

  16. N-th order Markov models • P(pixel | image) = P(pixel | neighborhood) (figure: a pixel and its surrounding neighborhood) • How big is this table for an NxN window?

  17. Statistical modeling of texture • Assume a stochastic model of texture (Markov Random Field) • Stationarity: the stochastic model is the same regardless of position • Markov property: p(pixel | rest of image) = p(pixel | neighborhood)

  18. Non-parametric sampling • Instead of explicitly defining P(pixel | neighborhood), we search the input image for all sufficiently similar neighborhoods (and pick one match at random) • Implicit model: synthesizing a pixel p from matches found in the input image
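A toy sketch of the search-and-pick-at-random step. This is not the full Efros-Leung algorithm; the 10% match tolerance is an arbitrary choice, and a real implementation would Gaussian-weight the neighborhood distances and grow the synthesis outward:

```python
import numpy as np

def synthesize_pixel(sample, neighborhood, rng=np.random.default_rng()):
    """Pick a value for the unknown center of `neighborhood` (a k x k
    patch with NaN at the unknown pixel) by matching its known pixels
    against every k x k window of the input `sample`, then choosing
    randomly among the near-best matches."""
    k = neighborhood.shape[0]
    known = ~np.isnan(neighborhood)
    h, w = sample.shape
    dists, centers = [], []
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            win = sample[i:i + k, j:j + k]
            dists.append(np.sum((win[known] - neighborhood[known]) ** 2))
            centers.append(win[k // 2, k // 2])
    dists = np.array(dists)
    # Keep all windows within 10% of the best match, pick one at random.
    ok = dists <= dists.min() * 1.1 + 1e-9
    return rng.choice(np.array(centers)[ok])
```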

  19. Agenda • Review of Proj 1 • Concepts for Proj 2 • Probability • Morphology • Image Blending

  20. Morphology • Operations on binary images: I(x,y) in {0,1} • Binary(x,y) = 1 if Grayscale(x,y) > threshold, 0 otherwise • Why do this? Fewer bits to encode the image • There are many operations we can perform on binary images

  21. Example: dilation • The mask (below) is often called the “structuring element”
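A direct (unoptimized) sketch of binary dilation with an arbitrary structuring element. Subtracting the input from the dilated result gives the border-finding trick shown two slides later:

```python
import numpy as np

def dilate(img, se):
    """Binary dilation: output pixel is 1 if any pixel under the
    structuring element `se` is 1. `img` and `se` are 0/1 int arrays;
    se is k x k with k odd; outside the image counts as 0."""
    k = se.shape[0]
    padded = np.pad(img, k // 2)
    h, w = img.shape
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = (padded[i:i + k, j:j + k] & se).any()
    return out
```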

  22. Connectedness Dilation: “I should turn on if any of my neighbors are turned on” How do we define neighbor? 8-connected neighbors 4-connected neighbors

  23. Example: finding borders of objects (figure panels: Image; Dilated with 3x3 mask; Dilated - Image)

  24. Grayscale dilation

  25. Erosion • “I should turn off if any of my neighbors are off” • Can we implement it with dilation?
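One answer to the slide's question: yes. For a symmetric structuring element, eroding an image is the same as dilating its complement and complementing the result. A self-contained sketch (dilation repeated here for completeness):

```python
import numpy as np

def dilate(img, se):
    """Binary dilation of 0/1 int arrays; square se with odd side."""
    k = se.shape[0]
    padded = np.pad(img, k // 2)
    h, w = img.shape
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            out[i, j] = (padded[i:i + k, j:j + k] & se).any()
    return out

def erode(img, se):
    """Erosion by duality: erode(I) = NOT dilate(NOT I) for a
    symmetric structuring element."""
    return 1 - dilate(1 - img, se)
```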

  26. Morphological opening & closing • Open(Image) = Dilate(Erode(Image)) • Close(Image) = Erode(Dilate(Image)) • Closing is what does binary hole filling

  27. Agenda • Review of Proj 1 • Concepts for Proj 2 • Probability • Morphology • Image Blending

  28. Problems with direct cloning From Perez et al. 2003

  29. Solution: clone gradient

  30. Idea: we’ll manipulate the gradient field of the image rather than the image itself • How can we compute df/dx & df/dy with a convolution (what are the filters)?
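One common answer to the slide's question: the forward-difference filters [-1, 1] (along x) and its transpose (along y). A NumPy sketch using explicit differences, which are equivalent to correlating with those filters:

```python
import numpy as np

# Tiny example image; values chosen arbitrarily for illustration.
f = np.array([[1.0, 2.0, 4.0],
              [1.0, 3.0, 6.0]])

# Forward differences: out[x] = f[x+1] - f[x], i.e. correlation
# with [-1, 1] along each axis (one row/column shorter than f).
dfdx = f[:, 1:] - f[:, :-1]
dfdy = f[1:, :] - f[:-1, :]
```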

  31. Seamless Poisson cloning • Given a vector field v (the pasted gradient), find the values of f in the unknown region Ω that minimize the integral over Ω of |∇f - v|², with f = f* (the background) on the boundary of Ω (figure: background f*, mask / unknown region Ω, pasted gradient v)

  32. Membrane interpolation • What if v is null? • We get the Laplace equation Δf = 0 (a.k.a. the membrane equation)

  33. Let’s consider this problem in 1-D • Minimize the integral of (df/dx)² such that f(a) = x1, f(b) = x2 • Using calculus of variations (we want a minimum, so differentiate the above functional and set it = 0), we find that d²f/dx² (the second derivative) must be 0 everywhere • Makes sense: if f(x) were curving, the total sum of slope² could be made smaller
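The variational step can be written out; a sketch of the 1-D derivation:

```latex
\min_f \int_a^b \left(\frac{df}{dx}\right)^2 dx,
\qquad f(a) = x_1,\quad f(b) = x_2.
% Euler-Lagrange with integrand L = (f')^2:
%   d/dx (\partial L / \partial f') = d/dx (2 f') = 2 f'' = 0
%   => f'' = 0 everywhere,
% so the minimizer is the straight line through the endpoints:
f(x) = x_1 + (x_2 - x_1)\,\frac{x - a}{b - a}.
```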

  34. In 2D: membrane interpolation (figure: surface interpolating boundary values x1, x2)

  35. Discrete Poisson solver • Discretized gradient: minimize the sum over neighboring pixel pairs (p,q) (all pairs with at least one pixel in the unknown region Ω) of (f(p) - f(q) - v(p,q))², with boundary condition f(p) = f*(p) for border pixels • Discretized v: v(p,q) = g(p) - g(q) • Rearrange, calling Np the neighbors of p: |Np| f(p) - Σ f(q) over q in Np∩Ω = Σ f*(q) over q in Np∩∂Ω (only for boundary pixels) + Σ v(p,q) over q in Np • Big yet sparse linear system

  36. Discrete Poisson solver • Knowns: g = desired (source) pixels, with v = gp - gq; f* = constrained border pixels; Np = list of neighbors of pixel p • Unknowns: f = reconstructed pixels

  37. Discrete Poisson solver • Knowns: g = desired (source) pixels, with v = gp - gq; f* = constrained border pixels; Np = list of neighbors of pixel p • Unknowns: f = reconstructed pixels • Write the equations as Af = b • A is a large sparse matrix, mostly 0s and ±1s • b is a vector of the right-hand sides of the above equations • Matlab can solve (for small images) in a single command: f = A\b
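The system can be set up and solved in a few lines. A 1-D dense toy sketch of the membrane case (v = 0, so each interior equation is 2·f(p) minus its two neighbors = 0); the size n and the boundary values are arbitrary, and real images need a sparse solver:

```python
import numpy as np

# 1-D membrane: unknown interior f[1..n-2], fixed endpoints f*.
n = 6
f_star_left, f_star_right = 10.0, 0.0   # boundary conditions

m = n - 2                               # number of unknowns
A = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
b = np.zeros(m)
b[0] += f_star_left                     # known border terms move to rhs
b[-1] += f_star_right

f = np.linalg.solve(A, b)               # the Matlab equivalent is f = A\b
```

With v null, the solution linearly interpolates the endpoints, matching the f'' = 0 result of the 1-D derivation.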

  38. Result (eye candy)

  39. Manipulate the gradient • Mix gradients of g & f: take the max

  40. Photomontage • http://grail.cs.washington.edu/projects/photomontage/photomontage.pdf
