
Introduction to Computer Vision Lecture 9



Presentation Transcript


  1. Introduction to Computer Vision, Lecture 9. Roger S. Gaborski

  2. Extract and Analyze Brandy
  >> im = imread('IMGP1715.JPG');
  >> imSm = imresize(im, .25);
  >> figure, imshow(imSm)

  3. Approaches: Gray scale thresholding

  4. 1. Gray scale thresholding
  Approach: first convert to gray scale (losing the color information), then threshold.
  >> imSmGray = rgb2gray(imSm);
  >> imSmGray = im2double(imSmGray);
  >> figure, imshow(imSmGray)
  >> figure, imshow(im2bw(imSmGray, graythresh(imSmGray)))
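MATLAB's graythresh implements Otsu's method, which picks the threshold that maximizes the between-class variance of the intensity histogram. A minimal NumPy sketch of the same idea; the toy image values are illustrative, not taken from the Brandy photo:

```python
import numpy as np

def otsu_threshold(gray, bins=256):
    """Return a threshold in [0, 1] maximizing between-class variance,
    the same criterion MATLAB's graythresh uses (Otsu's method)."""
    hist, edges = np.histogram(gray.ravel(), bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()                      # probability per bin
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                          # class-0 weight for each cut
    w1 = 1.0 - w0
    m0 = np.cumsum(p * centers)                # class-0 cumulative mean mass
    mT = m0[-1]                                # global mean
    # between-class variance; guard against empty classes
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros(bins)
    sigma_b[valid] = (mT * w0[valid] - m0[valid])**2 / (w0[valid] * w1[valid])
    # upper edge of the last bin assigned to class 0
    return edges[np.argmax(sigma_b) + 1]

# A toy "image" with two clearly separated intensity populations
img = np.array([[0.10, 0.12, 0.90],
                [0.11, 0.88, 0.92]])
t = otsu_threshold(img)
bw = img > t   # analogous to im2bw(imSmGray, graythresh(imSmGray))
```

On the Brandy image this computation puts much of the dog and the grass on the same side of the threshold, which is why the approach fails there.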

  5. Clearly unsuccessful. Why did it fail? The intensity values of Brandy's pixels are very close to those of the background, which makes it hard to segment based on the intensity distribution.

  6. Grayscale Histogram
  >> max(imSmGray(:))
  ans = 0.9804
  >> min(imSmGray(:))
  ans = 0.0510
  >> figure, imhist(imSmGray)
  There is no clear separation between the dog and background distributions.

  7. Approaches: Gray scale thresholding; detect edges and then segment

  8. 2. Edge Detection: Sobel

  9. 2. Edge Detection: Laplacian of Gaussian

  10. 2. Edge Detection: Canny
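All three detectors above are gradient-based, Sobel being the simplest. As a rough sketch of what the Sobel step computes internally (the unthresholded gradient magnitude, here on a toy step-edge image rather than the Brandy photo):

```python
import numpy as np

# Sobel kernels for horizontal and vertical derivatives
KX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)
KY = KX.T

def sobel_magnitude(img):
    """Gradient magnitude via 'valid' 2-D correlation with the Sobel kernels."""
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i+3, j:j+3]
            gx[i, j] = (patch * KX).sum()
            gy[i, j] = (patch * KY).sum()
    return np.hypot(gx, gy)

# Vertical step edge: left half dark, right half bright
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag = sobel_magnitude(img)   # strong response only at the step
```

On the Brandy image the dog/grass intensity step is weak, so this magnitude never rises cleanly above the texture response of the grass.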

  11. Reasons for the failures • Gray scale thresholding and edge detection: both algorithms work in gray scale space, taking into account only the intensity values of the pixels. However, the intensity values of the dog and the grass are very similar, which makes the noise very hard to eliminate; the edge detection algorithms also fail in this case. • They ignore the most informative component: the distinct colors of the brown dog and the green grass

  12. Approaches: Gray scale thresholding; detect edges and then segment; color segmentation. Color spaces: RGB. Euclidean distance; Mahalanobis distance

  13. 3. Color Segmentation: Euclidean Distance. Manually select pixels

  14. 3. Color Segmentation: Mahalanobis Distance. Brandy is detected, with some noise from the brown earth. Manually select pixels
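A sketch of the two distance measures on hand-picked sample pixels. The RGB values below are hypothetical stand-ins for the manually selected Brandy pixels; the Mahalanobis distance additionally uses the inverse covariance of the samples, which is what lets it account for correlations between the color planes:

```python
import numpy as np

# Hypothetical hand-picked "dog" sample pixels (one RGB triple per row),
# standing in for the manually selected Brandy pixels in the lecture
samples = np.array([[150., 100., 60.],
                    [160., 108., 66.],
                    [148.,  95., 55.],
                    [155., 104., 62.]])
mu = samples.mean(axis=0)                            # mean sample color
cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))

def euclidean_dist(pixels, mu):
    """Plain Euclidean distance from each pixel to the mean color."""
    return np.linalg.norm(pixels - mu, axis=1)

def mahalanobis_dist(pixels, mu, cov_inv):
    """Covariance-weighted distance: d^T * C^-1 * d per pixel."""
    d = pixels - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

pixels = np.array([[152., 101., 61.],    # brownish, close to the samples
                   [ 60., 150., 60.]])   # green, like the grass
de = euclidean_dist(pixels, mu)
dm = mahalanobis_dist(pixels, mu, cov_inv)
```

Because the brown samples are tightly clustered, both measures score the brown test pixel far lower than the green one; thresholding either distance map yields the segmentation.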

  15. Original Brandy picture. The brown earth has a very similar color to Brandy

  16. Discussion: None of the 3 planes alone will work for segmenting Brandy from the grass. However, by combining the R, G, and B planes, we can roughly segment Brandy from the grass with Euclidean distance, and achieve a desirable segmentation with Mahalanobis distance (which takes into account the correlations between the color planes).

  17. Individual Color Planes
  >> figure, subplot(2,3,1), imshow(imSm(:,:,1)), title('Red')
  >> subplot(2,3,2), imshow(imSm(:,:,2)), title('Green')
  >> subplot(2,3,3), imshow(imSm(:,:,3)), title('Blue')
  >> subplot(2,3,4), imshow(im2bw(imSm(:,:,1), graythresh(imSm(:,:,1)))), title('Red Threshold')
  >> subplot(2,3,5), imshow(im2bw(imSm(:,:,2), graythresh(imSm(:,:,2)))), title('Green Threshold')
  >> subplot(2,3,6), imshow(im2bw(imSm(:,:,3), graythresh(imSm(:,:,3)))), title('Blue Threshold')

  18. HSV Color Space. Displaying the combined HSV planes as a single image does not appear to work.
  >> imH = rgb2hsv(imSm);
  >> figure, imshow(imH)

  19. The hues of the brown color (Brandy) and the green color (grass) are distinct, which is perfect for separating the dog from the background; the saturation and value planes are hard to distinguish.
  >> figure, subplot(1,3,1), imshow(imH(:,:,1)), title('Hue')
  >> subplot(1,3,2), imshow(imH(:,:,2)), title('Saturation')
  >> subplot(1,3,3), imshow(imH(:,:,3)), title('Value')

  20. imhist(imH(:,:,1)): the histogram distribution in Hue space, showing the dog distribution, the grass distribution, and the separating line between them

  21. Dog pixels get a gray level value of 0. Still, some background pixels have a hue very similar to Brandy's.
  >> level = graythresh(imH(:,:,1))
  level = 0.1725 (automatic threshold)
  >> figure, imshow(imH(:,:,1) > level)
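The same hue separation can be sketched with Python's standard-library colorsys module. The brown and green RGB triples below are illustrative, not sampled from the image; only the 0.1725 threshold comes from the slide:

```python
import colorsys

# Hue separates the brown dog from the green grass: brown hues sit near 0
# (red-orange end), green hues near 1/3. Illustrative RGB values.
brown = (150/255, 100/255, 60/255)   # dog-like color (assumed)
green = ( 60/255, 150/255, 60/255)   # grass-like color (assumed)

h_brown = colorsys.rgb_to_hsv(*brown)[0]
h_green = colorsys.rgb_to_hsv(*green)[0]

level = 0.1725                 # the automatic graythresh level from the slide
dog_mask = h_brown <= level    # dog pixels map to 0 in imH(:,:,1) > level
grass_mask = h_green > level
```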

  22. Summary
  Color is the ideal descriptor for segmenting Brandy from the grass (distinct colors).
  Edge detection algorithms fail when the intensity values of adjacent pixels are very similar to each other.
  We will continue with color segmentation and morphological processing in the next lecture.
  Follow-up assignments on region growing and color segmentation will be posted on the course website shortly. You will be informed when they are posted.

  23. Brandy

  24. RainGirl Color Segmentation

  25. Approaches • We look at two approaches to color segmentation: • Region segmentation using a distance measurement • Region growing using seeds and a distance measurement

  26. Distance Measurement

  27. Distance Map

  28. Distance Threshold <.15

  29. Distance Threshold <.25

  30. L*a*b Color Space • 'L*': the luminosity (brightness) layer • 'a*': chromaticity layer indicating where the color falls along the red-green axis • 'b*': chromaticity layer indicating where the color falls along the blue-yellow axis

  31. Distance Measure in L*a*b Space • Only use a and b planes • Manually sample image • Estimate mean of a and b values • Calculate distance as before
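The four bullet points above can be sketched as follows; the (a*, b*) sample values and the toy 2x2 image are invented for illustration:

```python
import numpy as np

# Hypothetical hand-sampled (a*, b*) values for the target region
ab_samples = np.array([[20., 35.],
                       [22., 33.],
                       [19., 36.]])
ab_mean = ab_samples.mean(axis=0)    # estimated mean of a and b

def ab_distance(ab_plane, ab_mean):
    """Per-pixel Euclidean distance in the (a*, b*) plane only,
    ignoring the luminosity L*."""
    return np.sqrt(((ab_plane - ab_mean) ** 2).sum(axis=-1))

# A 2x2 "image" of (a*, b*) pairs; the left column resembles the samples
ab_img = np.array([[[ 21., 34.], [-30., 10.]],
                   [[ 20., 35.], [-25.,  5.]]])
dist_map = ab_distance(ab_img, ab_mean)
mask = dist_map < 10    # cf. the distMeasure < 10 threshold on the slides
```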

  32. Distance Map for L*a*b Space

  33. distMeasure < 40

  34. distMeasure < 30

  35. distMeasure < 20

  36. distMeasure < 10

  37. K-means Clustering • MATLAB: IDX = kmeans(X,k) partitions the points in the n-by-p data matrix X into k clusters. • How many clusters? k • Distance measure: Euclidean

  38. Very Simple Example • Consider 1-dimensional data (the algorithm works with n-dimensional data). Assume k = 2. Assume cluster centers: randomly pick 2 values.

  39. Very Simple Example • Measure the distance between the centers and the remaining points. Assign each point to the closer center. Recalculate the centers based on membership: z1 = 1, z2 = (2+3+5+6+7+8+9)/7 = 5.7143

  40. Very Simple Example • Centers: z1 = 1.000, z2 = 5.7143. Assign each point to the closer new center. Recalculate the centers based on membership: z1 = (1+2+3)/3 = 2.0, z2 = (5+6+7+8+9)/5 = 7.0

  41. Very Simple Example • No points are reassigned, so we are done: z1 = 2.0, z2 = 7.0
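The three-slide worked example can be reproduced in a few lines of Python (the lecture's environment is MATLAB; this is just a sketch of the same iteration):

```python
def kmeans_1d(data, centers, max_iter=100):
    """Lloyd's algorithm on 1-D data, mirroring the slides' walkthrough."""
    for _ in range(max_iter):
        # assign each point to the nearest center
        clusters = [[] for _ in centers]
        for x in data:
            nearest = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            clusters[nearest].append(x)
        # recompute each center as the mean of its members
        new = [sum(c) / len(c) if c else centers[j]
               for j, c in enumerate(clusters)]
        if new == centers:          # no center changed: converged
            return centers
        centers = new
    return centers

# The slides' data with initial centers 1 and 2
data = [1, 2, 3, 5, 6, 7, 8, 9]
final = kmeans_1d(data, [1.0, 2.0])
```

Starting from centers 1 and 2, the first pass yields 1 and 5.7143, the second pass 2.0 and 7.0, and the third pass changes nothing, so the algorithm stops.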

  42. K-means Clustering • The K-means algorithm has self-organizing properties • n-dimensional vectors may be considered points in an n-dimensional Euclidean space. By a Euclidean space we mean an Rn space with the distance between vectors x and y defined as: d(x, y) = sqrt( (x1-y1)^2 + (x2-y2)^2 + … + (xn-yn)^2 ). The Euclidean norm or length of x is ||x|| = sqrt( x1^2 + x2^2 + … + xn^2 ) • K-means is one of many techniques that use the notion of clustering by minimum distance. Why does using minimum distance make sense?

  43. K-means Clustering • Two vectors that represent points in n space that are geometrically close may in some sense belong together • Notation: • Norm or length of vector x: ||x|| = ( Σi xi^2 )^(1/2) • Distance between two vectors: ||x - z|| = ( Σi (xi - zi)^2 )^(1/2)

  44. K-Means Algorithm • We measure how close vectors are • We establish cluster centers and partition the vectors into clusters such that the distance between a vector and the center it is assigned to is minimal compared with the other centers • With k-means you need to know the number of cluster centers, k

  45. K-Means Algorithm • Set of input vectors: {x(1), x(2), …, x(p)} • zj represents the cluster center of the jth cluster: the position in Euclidean space at which that center is located. Since there are k clusters, there are centers z1, z2, …, zk • Sj = { } represents the set of samples that belong to the jth cluster

  46. Procedure • 1. Initialize: choose the number of clusters, k. For each of the k clusters choose an initial center { z1(l), z2(l), …, zk(l) }, where zj(l) represents the value of the jth cluster center at the lth iteration • 2. Distribute all the sample vectors: assign each sample vector x(p) to the cluster with the nearest center: x(p) ∈ Sj(l) if ||x(p) - zj(l)|| < ||x(p) - zi(l)|| for all i = 1, 2, 3, …, k, i ≠ j

  47. Procedure 2 • 3. Calculate new cluster centers: using the new cluster membership, recalculate each center so that the sum of the distances from each member to the new center is minimized: zj(l+1) = (1/Nj) Σ x(p), where the sum runs over x(p) ∈ Sj(l) and Nj is the number of samples assigned to Sj • If zj(l+1) = zj(l) for all j, no cluster center has changed: stop. Otherwise, go to step 2 and redistribute the vectors x(p)
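Steps 1-3 of the procedure can be sketched in NumPy for n-dimensional vectors; the two-blob data and the deliberately poor initial centers are illustrative:

```python
import numpy as np

def kmeans(X, k, z_init, max_iter=100):
    """Steps 1-3 from the slides: assign each x(p) to the nearest center
    z_j, recompute z_j(l+1) as the mean of S_j(l), stop when unchanged."""
    z = np.asarray(z_init, dtype=float)
    for _ in range(max_iter):
        # step 2: distribute samples to the nearest center
        d = np.linalg.norm(X[:, None, :] - z[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # step 3: recalculate each center from its cluster membership
        z_new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                          else z[j] for j in range(k)])
        if np.allclose(z_new, z):   # z_j(l+1) == z_j(l): converged
            return z, labels
        z = z_new
    return z, labels

# Two well-separated 2-D blobs; initial centers both inside the first blob
X = np.array([[0., 0.], [0., 1.], [1., 0.],
              [9., 9.], [9., 10.], [10., 9.]])
z, labels = kmeans(X, 2, [[0., 0.], [1., 1.]])
```

Even from these poor initial centers, the first redistribution already separates the two blobs, and the centers settle on the blob means.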

  48. Use k-means to cluster pixels. Does not require samples.

  49. Convert the RGB color image to L*a*b color space • Classify the pixels in L*a*b space using k-means clustering • Ignore the L information; use a and b • Label the pixels and display

  50. 3 Clusters
