
Grouping and Segmentation



Presentation Transcript


  1. Grouping and Segmentation

  2. Sometimes edge detectors find the boundary pretty well.

  3. Sometimes it’s not enough.

  4. Improve Boundary Detection • Integrate information over distance. • Piece together curve fragments from the edge detector into longer curves. • Get the user to help?

  5. Humans integrate contour information.

  6. Canny edges

  7. Canny edges. Can we find the long straight lines from the result of edge detection?

  8. Hough Transform Can we detect this straight line?

  9. Hough Transform All possible lines going through given image points

  10. Hough Transform All possible lines going through given image points Each image point gives one curve (showing all lines through that point)
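A standard way to make "all lines through a point" concrete (the slides do not spell out the parameterization, so the one below is the usual textbook choice) is to describe a line by rho = x·cos(theta) + y·sin(theta). A single image point (x, y) then traces one sinusoidal curve in (rho, theta) space; that is the curve each point contributes. A minimal NumPy sketch, with the point coordinates chosen arbitrarily:

    import numpy as np

    x, y = 30, 40                                   # an arbitrary example image point
    thetas = np.deg2rad(np.arange(0, 180))          # candidate line orientations
    rhos = x * np.cos(thetas) + y * np.sin(thetas)  # the (rho, theta) curve for this point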

  11. Hough Transform All possible lines going through given image points

  12. Hough Transform All possible lines going through given image points

  13. Hough Transform All possible lines going through given image points

  14. Hough Transform All possible lines going through given image points

  15. Hough Transform All possible lines going through given image points. Each image point gives one curve (showing all lines through that point). This line passes through every image point.

  16. Hough Transform All possible lines going through given image points • Idea: each image point votes for all lines passing through it. • All points on a line vote for it, so it gets many votes and stands out.

  17. Hough Transform All possible lines going through given image points • Idea: each image point votes for all lines passing through it. • All points on a line vote for it, so it gets many votes and stands out. • Accumulate votes in bins.

  18. Hough Transform All possible lines going through given image points. Line with most votes (all 21 points). Accumulate votes in bins.
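A minimal sketch of this voting scheme, assuming a binary NumPy edge image `edges` and the rho–theta parameterization above; the bin counts `n_theta` and `n_rho` are illustrative choices, not values from the slides:

    import numpy as np

    def hough_vote(edges, n_theta=180, n_rho=200):
        """Accumulate (rho, theta) votes for every edge pixel."""
        h, w = edges.shape
        diag = np.hypot(h, w)                              # largest possible |rho|
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        rhos = np.linspace(-diag, diag, n_rho)
        acc = np.zeros((n_rho, n_theta), dtype=int)
        ys, xs = np.nonzero(edges)                         # coordinates of edge points
        for x, y in zip(xs, ys):
            r = x * np.cos(thetas) + y * np.sin(thetas)    # rho for every theta
            bins = np.round((r + diag) / (2 * diag) * (n_rho - 1)).astype(int)
            acc[bins, np.arange(n_theta)] += 1             # one vote per (rho, theta) bin
        return acc, rhos, thetas

    # The bin with the most votes gives the dominant line:
    # i, j = np.unravel_index(acc.argmax(), acc.shape)  ->  rho = rhos[i], theta = thetas[j]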

  19. Hough Transform Noisy line

  20. Hough Transform Noisy line

  21. Hough Transform Noisy line. Winning bin has 18/21 votes.

  22. Hough Transform No line, just random points

  23. Hough Transform No line, just random points

  24. Hough Transform No line, just random points

  25. Hough Transform No line, just random points. Still 7 votes!

  26. Hough Transform Both line and random points

  27. Hough Transform Both line and random points

  28. Hough Transform Both line and random points

  29. Hough Transform Both line and random points. Still many bins with over 10 votes. Histogram of vote numbers.

  30. Hough Transform Lines for bins with 10 or more votes

  31. Hough Transform Problems • Bin size chosen too big: inaccurate lines; can't distinguish very different lines; get fake lines from noise. • Bin size chosen too small: too few votes per bin; can miss the true lines. • How many lines? Where to set the cutoff on vote number?

  32. Figure panels: edge detection; votes; all lines through edge points.

  33. Top 10 lines (most votes)

  34. Top 10 lines (most votes) • Why aren't the straight lines detected? • Too much noise. • Lines are too short (longer lines through noise points accumulate more votes). • We didn't consider gradient direction.
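In practice one usually runs an off-the-shelf implementation and tunes its thresholds rather than hand-rolling the accumulator. A rough sketch using OpenCV's probabilistic Hough transform (the file name and all numeric thresholds below are illustrative guesses, not values from the slides); the vote threshold and minimum segment length are the knobs aimed at the "too much noise" and "lines are too short" problems above:

    import cv2
    import numpy as np

    img = cv2.imread("image.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
    edges = cv2.Canny(img, 50, 150)                        # Canny edge map
    # Probabilistic Hough: vote threshold, minimum segment length, maximum gap
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=40, maxLineGap=5)
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            cv2.line(img, (x1, y1), (x2, y2), 255, 2)      # draw the detected segments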

  35. Hough Transform: Robust Statistics • Fitting a line in noise

  36. Hough Transform: Robust Statistics • Fitting a line in noise. Line points: inliers.

  37. Hough Transform: Robust Statistics • Fitting a line in noise. Unrelated random noise: outliers. Noisy line points: inliers.

  38. Hough Transform: Robust Statistics • Fitting a line in noise • Very important problem in vision and elsewhere

  39. Hough Transform: Robust Statistics • Fitting a line in noise • Can't use least squares to fit a line to the points: which points should we fit? Reminder: the least-squares fit (for the best line y = ax + b) finds the a, b that minimize the sum of squared errors Σ_i (y_i − (a·x_i + b))².
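As a reminder of what that minimization looks like in code, here is a tiny sketch on made-up data (the true slope 2, intercept 1, and noise level are arbitrary); `np.polyfit` returns the least-squares estimates of a and b:

    import numpy as np

    x = np.linspace(0, 10, 21)                                     # sample x positions
    y = 2.0 * x + 1.0 + np.random.normal(scale=0.5, size=x.size)   # noisy line samples
    a, b = np.polyfit(x, y, 1)                                     # least-squares slope and intercept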

  40. Hough Transform: Robust Statistics • Fitting a line in noise • Hough transform solves this problem.

  41. Hough Transform: Robust Statistics • Hough transform: fitting a line or other structures (circle, rectangle, figure eight, …) in noise. • BUT Hough often works badly: too sensitive to parameter choices; can miss lines or find wrong ones. • Better suited to finding very constrained structures, e.g. vertical figure eights of known size, or two perfectly aligned, equal circles. • A better solution for finding structures in noise: RANSAC.

  42. RANSAC for lines • Finding a line in noise: RANSAC.
      For K iterations:
        Choose two data points randomly.
        Compute the line passing through them.
        Count all data points within threshold distance D of this line.
        If Count is larger than C:
          Store the data points close to the line.
          Use a least-squares fit to find the best line through these points.
          Store this line.
      Choose the line giving the largest Count
      (or the line with the smallest least-squares error in the fit to its set of close points).
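A minimal sketch of this recipe in NumPy; the function name and the default values of K, D, and C are mine, and the least-squares refit assumes the line is not vertical (a real implementation would handle that case):

    import numpy as np

    def ransac_line(points, K=100, D=1.0, C=10, rng=None):
        """Fit a line to noisy 2-D points following the recipe above.

        points: (N, 2) array; K iterations; D inlier distance; C minimum inlier count.
        """
        rng = np.random.default_rng() if rng is None else rng
        best_line, best_count = None, 0
        for _ in range(K):
            i, j = rng.choice(len(points), size=2, replace=False)
            p1, p2 = points[i], points[j]
            # Line through p1 and p2 in implicit form n.x + c = 0, with unit normal n.
            d = p2 - p1
            n = np.array([-d[1], d[0]])
            if np.linalg.norm(n) == 0:
                continue                               # the two samples coincide; resample
            n = n / np.linalg.norm(n)
            c = -n @ p1
            dist = np.abs(points @ n + c)              # distance of every point to the line
            inliers = points[dist < D]
            if len(inliers) > C and len(inliers) > best_count:
                # Least-squares refit on the close points (slope/intercept form).
                a, b = np.polyfit(inliers[:, 0], inliers[:, 1], 1)
                best_line, best_count = (a, b), len(inliers)
        return best_line, best_count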

  43. RANSAC for lines (same recipe as slide 42) • Why does it work? After K iterations, the chance of having chosen a bad point every time is (1 − w²)^K, where w is the fraction of "good" points (inliers) in the data set.

  44. RANSAC for lines (same recipe as slide 42) • Why does it work? The failure chance (1 − w²)^K shrinks rapidly with K, so the chance of RANSAC success, 1 − (1 − w²)^K, can be made very high (99.7% in the slide's example).
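The slide's specific values of w and K are not preserved in this transcript; as an illustration, assuming an inlier fraction w = 0.5 and K = 20 iterations reproduces a success chance of about 99.7%:

    w, K = 0.5, 20                    # assumed inlier fraction and iteration count
    p_fail = (1 - w**2) ** K          # every one of the K samples contains a bad point
    print(1 - p_fail)                 # ~0.9968, i.e. about a 99.7% chance of success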

  45. RANSAC for lines (same recipe as slide 42) • Many variations and related algorithms exist: Preemptive RANSAC, MLESAC, …
