
Presentation Transcript


  1. Computational Agriculture: Video Monitoring of Honey Bees. Subhabrata Bhattacharya (Subh). Intel Mentors: L. Mummert, R. Sukthankar, J. Campbell. 07/08/2010

  2. About Me
  • School: University of Central Florida
  • Advisor(s): Prof. Mubarak Shah, Dr. Rahul Sukthankar
  • Thesis topic: Not decided yet
  • A little-known fact about me: I am an outdoor adventure enthusiast.

  3. Motivation
  • Bees are critical for agricultural pollination
  • ~130 crops, roughly USD 15 bn per year in the US across fruits, vegetables, textiles, nuts, etc.
  • Increasing pollination demands: almonds (1.5M colonies needed by 2010), apples, plums, …
  • Populations are in decline*
  • 1996: feral bees virtually eliminated by Varroa destructor
  • 2004: 50% decrease in managed colonies over the past 50 years
  • 2006: cases of colony collapse disorder
  *USDA Agricultural Research Service, 5/2008

  4. Provide Colony Information
  • Previous approaches:
  • Manual inspection
  • Photo-electrical counter [Spangler et al. '69]
  • LASER bar-code scanner [Sasaki et al. '89, Sandford et al. '89]
  • Infrared counter [Struye '94]
  • Mechanically tagged inspection [Landgraf et al. '07]
  • [Slide figure labels: bees entering / bees exiting; drawbacks noted: stinging, invasive, tedious/impractical]

  5. A Cheaper Alternative
  • Bee-hive monitoring using computer vision [Campbell et al. '08]
  • Non-invasive technique
  • Commodity sensor hardware (camera)

  6. The Research Question that I Hope to Answer
  "How can the video monitoring capabilities of the existing system be extended to efficiently detect and track bees?"
  • Dataset difficulty levels:
  • Easy: clean background, fewer bees, few shadow artifacts
  • Moderately difficult: cluttered background, more bees, shadow artifacts
  • Difficult: more shadow artifacts, occlusion, clutter

  7. The Biggest Challenges in Answering this Question
  • Standard computer vision challenges:
  • Clutter (moving foliage, debris)
  • Outdoor illumination variation
  • Shadows
  • Orientation changes
  • Size changes (perspective effects)

  8. The Biggest Challenges in Answering this Question
  • Domain-specific computer vision challenges:
  • Background subtraction is difficult (guard bees barely move)
  • Part-based detectors fail (not enough shape information)
  • Blob matching is difficult (bees have similar shape and color)
  • Optical flow is only reliable for displacements of a few pixels, while bees here move ~100 px/frame

  9. My Approach to Answering this Question
  • Pipeline (slide diagram): Input Video → Motion Detection → all moving blobs → Candidate Blob Filtering → filtered blobs → Blob Matching → trajectories → Trajectory Analysis → Arrival/Departure Counts
  (A hypothetical data-flow sketch follows.)
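To make the diagram concrete, here is a minimal, hypothetical Python sketch of the data flow only; every function name and body below is a placeholder I introduced for illustration, standing in for the actual methods described on the following slides.

```python
# Hypothetical data-flow skeleton for the pipeline above; each stage body is a
# placeholder, not the real implementation.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Blob:
    centroid: Tuple[float, float]   # blob center in image coordinates
    area: float                     # blob size in pixels

@dataclass
class Track:
    points: List[Tuple[float, float]] = field(default_factory=list)

def detect_moving_blobs(frame) -> List[Blob]:             # Motion Detection -> all moving blobs
    return []                                             # placeholder

def filter_candidates(blobs: List[Blob]) -> List[Blob]:   # -> filtered blobs
    return blobs                                          # placeholder

def associate_blobs(tracks: List[Track], blobs: List[Blob]) -> List[Track]:
    return tracks                                         # placeholder: Blob Matching -> trajectories

def count_crossings(tracks: List[Track]) -> Tuple[int, int]:
    return 0, 0                                           # placeholder: Trajectory Analysis -> counts

def process_video(frames) -> Tuple[int, int]:
    """Run the four stages over a frame sequence and return (arrivals, departures)."""
    tracks: List[Track] = []
    for frame in frames:
        blobs = filter_candidates(detect_moving_blobs(frame))
        tracks = associate_blobs(tracks, blobs)
    return count_crossings(tracks)
```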

  10. Motion-Based Detection
  • What methods we tried:
  • GMM background subtraction [Stauffer & Grimson, CVPR '98]
  • Pixel-discrimination background subtraction [Li et al., ACM MM '03]
  • Pyramid-based background subtraction
  • Accumulative frame differencing (a.f.d.) [Ali & Shah, SPIE '06]
  • What worked best:
  • Pyramid background subtraction (robust to large motion) + a.f.d. (robust to local intensity changes)
  • 2 pyramid levels for background subtraction
  • 5-frame temporal window for a.f.d.
  (A hedged OpenCV sketch of this combination follows.)
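Below is a hedged sketch of how the winning combination might be coded with OpenCV. Assumptions made for illustration: OpenCV's MOG2 subtractor stands in for the pyramid/GMM background model, the per-pixel difference threshold is arbitrary, and the two masks are fused with a logical AND.

```python
# Sketch (not the authors' code): accumulative frame differencing over a
# 5-frame window, combined with background subtraction run at a coarse
# pyramid level and upsampled back to full resolution.
import cv2
import numpy as np
from collections import deque

WINDOW = 5          # temporal window for a.f.d. (from the slide)
PYR_LEVELS = 2      # pyramid levels for background subtraction (from the slide)
DIFF_THRESH = 25    # per-pixel change threshold (assumed)

frames = deque(maxlen=WINDOW)
bg_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=False)  # stand-in model

def motion_mask(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    frames.append(gray.astype(np.int16))

    # Accumulative frame differencing: accumulate absolute differences with
    # respect to the oldest frame in the window, then threshold the average.
    acc = np.zeros(gray.shape, dtype=np.int32)
    for f in frames:
        acc += np.abs(f - frames[0])
    afd = (acc / max(len(frames) - 1, 1) > DIFF_THRESH).astype(np.uint8) * 255

    # Background subtraction on a downsampled image (robust to large motion),
    # upsampled back to the original resolution.
    small = gray
    for _ in range(PYR_LEVELS):
        small = cv2.pyrDown(small)
    fg = bg_sub.apply(small)
    fg = cv2.resize(fg, (gray.shape[1], gray.shape[0]),
                    interpolation=cv2.INTER_NEAREST)

    return cv2.bitwise_and(afd, fg)  # keep pixels flagged by both cues
```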

  11. Candidate Blob Filtering
  • What methods we tried:
  • Template matching [Campbell et al., VAIB '08]
  • Viola-Jones cascade classifier trained on bee/non-bee patches [Viola & Jones, IJCV '01]
  • Area/size-based thresholds
  • SVM classifier trained on RGB intensity values of bee/non-bee pixels
  • Matching the color histogram of bee pixels
  • What worked best:
  • Size-based threshold to filter small noisy blobs (usually shadows, leaves)
  • Compare histogram distances against a threshold
  • [Slide figure: examples of false vs. true blobs and bee vs. non-bee pixels]
  (A sketch of this filter follows.)
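The sketch below illustrates the size-plus-color-histogram filter. The minimum area, the bin count, the Bhattacharyya-distance threshold, and the precomputed reference histogram `bee_hist` (built from labeled bee pixels) are assumptions made for illustration, not the values used in the project.

```python
# Sketch of candidate blob filtering: a size threshold removes small noisy
# blobs (shadows, leaves), then a color-histogram distance against a reference
# "bee" histogram removes remaining false blobs.
import cv2

MIN_AREA = 50        # minimum blob area in pixels (assumed)
HIST_THRESH = 0.4    # maximum Bhattacharyya distance to accept (assumed)

def color_histogram(patch_bgr, bins=8):
    hist = cv2.calcHist([patch_bgr], [0, 1, 2], None,
                        [bins, bins, bins], [0, 256] * 3)
    return cv2.normalize(hist, hist).flatten()

def filter_blobs(frame_bgr, blob_boxes, bee_hist):
    """blob_boxes: list of (x, y, w, h) boxes from the motion mask."""
    kept = []
    for (x, y, w, h) in blob_boxes:
        if w * h < MIN_AREA:                 # size threshold
            continue
        patch = frame_bgr[y:y + h, x:x + w]
        dist = cv2.compareHist(color_histogram(patch), bee_hist,
                               cv2.HISTCMP_BHATTACHARYYA)
        if dist < HIST_THRESH:               # close enough to the bee color model
            kept.append((x, y, w, h))
    return kept
```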

  12. Blob Association / Obtaining Trajectories
  • What methods we tried:
  • Particle-filter-based tracking (nonlinear state estimation of blobs)
  • Blob tracking (matching similar blobs)
  • What worked best:
  • Extract a blob signature: size, eccentricity, orientation, displacement from entry
  • Assign blobs across frames using greedy search over signature distance
  • Generate trajectories
  (A sketch of the greedy association follows.)
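A hedged sketch of the greedy association step follows. The exact signature distance, the equal weighting of the displacement term, and the gating threshold are illustrative assumptions; in practice the features would need to be scaled to comparable ranges.

```python
# Sketch of greedy blob-to-track assignment: compute a cost for every
# (track, blob) pair from the blob signature and displacement, accept the
# cheapest pairs first, and start new trajectories for unmatched blobs.
import numpy as np

MAX_COST = 5.0  # gating threshold before starting a new trajectory (assumed)

def signature(blob):
    # blob: dict with 'area', 'eccentricity', 'orientation', 'centroid'
    return np.array([blob['area'], blob['eccentricity'], blob['orientation']])

def greedy_associate(tracks, blobs):
    pairs = []
    for ti, track in enumerate(tracks):
        for bi, blob in enumerate(blobs):
            motion = np.linalg.norm(np.subtract(blob['centroid'],
                                                track['last_centroid']))
            cost = np.linalg.norm(signature(blob) - track['signature']) + motion
            pairs.append((cost, ti, bi))

    used_tracks, used_blobs = set(), set()
    for cost, ti, bi in sorted(pairs):           # cheapest matches first
        if cost > MAX_COST or ti in used_tracks or bi in used_blobs:
            continue
        tracks[ti]['trajectory'].append(blobs[bi]['centroid'])
        tracks[ti]['last_centroid'] = blobs[bi]['centroid']
        tracks[ti]['signature'] = signature(blobs[bi])
        used_tracks.add(ti)
        used_blobs.add(bi)

    # any blob that was not matched starts a new trajectory
    for bi, blob in enumerate(blobs):
        if bi not in used_blobs:
            tracks.append({'trajectory': [blob['centroid']],
                           'last_centroid': blob['centroid'],
                           'signature': signature(blob)})
    return tracks
```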

  13. Qualitative Results of Trajectory Generation
  • Easy: clean background, fewer bees, few shadow artifacts
  • Moderately difficult: cluttered background, more bees, shadow artifacts
  • Difficult: more shadow artifacts, occlusion, clutter

  14. Evaluation Metrics for Quantitative Analysis
  • Performance of detection:
  • MODP - Multiple Object Detection Precision*
  • CPD - Cumulative Probability of Detection (easy to visualize)
  • CFAR - Cumulative False Alarm Rate (easy to visualize)
  • Performance of tracking:
  • MOTP - Multiple Object Tracking Precision*
  *CLEAR MOT metrics [Bernardin, JIVP '08]
  (A sketch of the normalized MODP computation follows.)
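As a reference for the detection metric, here is a minimal sketch of the normalized MODP, assuming detections have already been mapped to ground-truth boxes in each frame; the box format and the handling of frames with no mapped pairs are simplifying assumptions.

```python
# Sketch of normalized MODP: per frame, average the spatial overlap of each
# mapped ground-truth/detection pair, then average those values over frames.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / float(union) if union else 0.0

def n_modp(mapped_frames):
    """mapped_frames: per frame, a list of (gt_box, det_box) pairs."""
    per_frame = []
    for pairs in mapped_frames:
        overlap = sum(iou(g, d) for g, d in pairs)
        per_frame.append(overlap / len(pairs) if pairs else 0.0)  # MODP(t)
    return sum(per_frame) / len(per_frame) if per_frame else 0.0
```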

  15. Analysis of CPD, CFAR on Easy Dataset
  • 94.19% true detections, 1 false positive every two frames
  • Color histogram matching reduces the number of false alarms
  • Background subtraction alone or a.f.d. alone is not reliable

  16. Analysis of CPD, CFAR on Intermediate Dataset
  • 84.66% true detections, ~2 false positives per frame
  • 5-6 guard bees per frame (little motion) are missed by motion-based detection; only an appearance-based detector (sliding window) might help

  17. Analysis of CPD, CFAR on Difficult Dataset
  • 72.00% true detections, ~8 false positives per frame
  • More guard bees per frame (little motion), so detection fails on them
  • More clutter: blobs merge, and the larger merged blobs are filtered out, reducing true detections

  18. Normalized MODP for Different Methods
  a - Accumulative frame differencing [Ali & Shah, SPIE '06]
  b - Background subtraction [Stauffer & Grimson, CVPR '98]
  c - Background subtraction [Li et al., ACM MM '03]
  d - Color SVM bee/non-bee classification
  e - Template matching (correlation) [Campbell et al., VAIB '08]
  f - Haar features, cascade [Viola & Jones, IJCV '01]
  g - Color histogram, Bhattacharyya distance matching

  19. Preliminary Tracking Results
  • Normalized Multiple Object Tracking Precision (n-MOTP)
  • Fewer annotations available for tracking (for the two harder datasets)
  *http://web.engr.oregonstate.edu/~hess/downloads/track.tar.gz

  20. Criteria for Success
  • 75% goal:
  • Comparative evaluation of the detection techniques (9 techniques tried)
  • Implement a PF- or HMM-based tracker (PF tracker implemented)
  • 100% goal:
  • Detection accuracy ~95% (Easy: 94.12%)
  • Counting (detection + tracking) accuracy ~90%* (needs improvement)
  • 125% (aka stretch) goal:
  • High-performance implementation (i.e., non-MATLAB) (C/OpenCV-based detector/tracker)
  • Determine the resources required to achieve real-time performance (not done)
  *Extremely ambitious goal!

  21. Thank you!
