Dataset Production and Performance Evaluation for Event Detection and Tracking

  1. Dataset Production and Performance Evaluation for Event Detection and Tracking. Paul Hosmer, Detection and Vision Systems Group, Scientific Development Branch. BMVA Performance Evaluation Symposium 2007

  2. Outline
  • Defining a requirement
  • What to include in datasets
  • Constraints
  • Evaluation and metrics
  • Case study

  3. Background: Intelligent Video
  • Started in the early 1990s
    • FABIUS
    • Amethyst
  • Through to the 2000s
    • VMD capability study
    • Standards-based evaluations

  4. What did we want to achieve?
  • Test systems in a short period of time
  • Provide data and requirements to the research community
  Dataset production
  • Problem: what to include?

  5. Scenario definition
  • What is an event?
  • Where does the scenario take place?
  • What challenges are posed by the environment?
  Ask end users / gauge demand. Conduct a capability study. Monitor the environment and apply a priori knowledge.

  6. Scenario definition: Abandoned Baggage
  • When is an object abandoned?
  • What types of object?
  • Attributes of person?

  7. Scenario definition: "Abandoned object"
  • During the current clip, a person has placed an object which was in their possession when they entered the clip onto the floor or a seat in the detection area, and
  • that person has left the detection area without the object, and
  • over sixty seconds after they left the detection area, that person has still not returned to the object, and
  • the object remains in the detection area.
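
  A minimal sketch of how this definition might be checked against ground-truth annotations. The annotation schema and field names below are illustrative assumptions, not the i-LIDS format:

```python
from dataclasses import dataclass
from typing import Optional

ABANDON_DELAY_S = 60.0  # "over sixty seconds" from the definition above


@dataclass
class ObjectAnnotation:
    """Hand-annotated timeline for one object and its owner (illustrative schema)."""
    placed_in_area_at: Optional[float]     # time the owner put the object down in the detection area
    owner_left_area_at: Optional[float]    # time the owner left the detection area without the object
    owner_returned_at: Optional[float]     # time the owner returned to the object, if ever
    removed_from_area_at: Optional[float]  # time the object left the detection area, if ever


def is_abandoned(obj: ObjectAnnotation, now: float) -> bool:
    """True if the object meets the abandonment definition at time `now`."""
    if obj.placed_in_area_at is None or obj.owner_left_area_at is None:
        return False  # object never placed, or owner never left without it
    if now - obj.owner_left_area_at <= ABANDON_DELAY_S:
        return False  # sixty-second grace period has not yet elapsed
    if obj.owner_returned_at is not None and obj.owner_returned_at <= now:
        return False  # owner came back to the object
    if obj.removed_from_area_at is not None and obj.removed_from_area_at <= now:
        return False  # object no longer in the detection area
    return True
```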

  8. Scenario definition
  Key environmental factors:
  • Lighting changes – film at dawn and dusk
  • Rain and snow
  • Night – headlights and low SNR

  9. How much data?
  • Need to demonstrate performance on a wide range of imagery
  • Statistical significance
  • Need a large training and test corpus – hundreds of events
  • Unseen data for verification
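
  As a rough illustration of why hundreds of events are needed for statistical significance (this is my own sketch using a standard normal-approximation confidence interval; the event counts are purely illustrative and not from the slides):

```python
import math


def detection_rate_ci(true_positives: int, total_events: int, z: float = 1.96):
    """95% normal-approximation (Wald) confidence interval for a detection rate."""
    p = true_positives / total_events
    half_width = z * math.sqrt(p * (1.0 - p) / total_events)
    return max(0.0, p - half_width), min(1.0, p + half_width)


# With only 20 events, an observed 90% detection rate is very loosely pinned down;
# with 300 events the interval tightens considerably.
print(detection_rate_ci(18, 20))    # roughly (0.77, 1.00)
print(detection_rate_ci(270, 300))  # roughly (0.87, 0.93)
```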

  10. Constraints
  • You can't always capture the event you want – simulate it
  • Make simulations as close to the requirement as possible
  • Storage vs image quality – what will you want to do with the data at a later time?
  • Cost – try to film as much variation and as many events as you can

  11. Performance Evaluation
  • Importance of metrics – consistency across different evaluations
  • When is an event detected?
  • Real-time evaluation, 10x real time, offline… which is most useful?
  • Statistically significant unseen dataset: performance on training data does not tell me anything useful about robustness

  12. How HOSDB does it
  • Simulate a real analogue CCTV system
  • ~215,000 frames per scenario evaluation
  • ~300 events per evaluation
  • 60 s to alarm after the ground-truth (GT) alarm condition is satisfied
  • One figure of merit for ranking
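
  A minimal sketch of how such an alarm-matching rule could be scored. The greedy one-to-one matching and the function name are my own illustration of a 60-second alarm window, not the actual HOSDB scoring code:

```python
def score_alarms(gt_onsets, system_alarms, window_s=60.0):
    """Count TP/FP/FN given ground-truth alarm-condition onset times and system alarm times.

    A system alarm is a true positive if it fires within `window_s` seconds after a
    ground-truth onset that has not already been matched; unmatched system alarms are
    false positives and unmatched ground-truth onsets are false negatives.
    """
    gt_remaining = sorted(gt_onsets)
    tp = fp = 0
    for alarm in sorted(system_alarms):
        match = next((g for g in gt_remaining if 0.0 <= alarm - g <= window_s), None)
        if match is not None:
            gt_remaining.remove(match)
            tp += 1
        else:
            fp += 1
    fn = len(gt_remaining)
    return tp, fp, fn


# Example: three ground-truth events; the system fires twice in time and once spuriously.
print(score_alarms(gt_onsets=[10.0, 120.0, 400.0], system_alarms=[30.0, 150.0, 700.0]))  # (2, 1, 1)
```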

  13. F1 score for event detection
  R = TP / (TP + FN)
  P = TP / (TP + FP)
  F1 = (α + 1)·R·P / (R + α·P)
  α ranges from 0.35 to 75 depending on scenario and application
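
  A small sketch of the weighted F-score exactly as written on the slide; the TP/FP/FN counts and the α value in the example are purely illustrative:

```python
def weighted_f1(tp: int, fp: int, fn: int, alpha: float) -> float:
    """F1 = (alpha + 1)·R·P / (R + alpha·P), with R = TP/(TP+FN) and P = TP/(TP+FP)."""
    r = tp / (tp + fn)
    p = tp / (tp + fp)
    denom = r + alpha * p
    return 0.0 if denom == 0 else (alpha + 1.0) * r * p / denom


# Larger alpha pushes the score towards recall (missed events cost more); smaller alpha
# pushes it towards precision (false alarms cost more). The slide quotes alpha values
# from 0.35 to 75 depending on scenario and application.
print(weighted_f1(tp=270, fp=40, fn=30, alpha=0.35))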

  14. What about tracking? The 5th i-LIDS scenario: Multiple Camera Tracking
  • Increasing interest from end users
  • Significant potential to enhance operator effectiveness and aid post-event investigation
  The problem
  • Unifying track labelling across multiple camera views
  Dataset and evaluation problem
  • Synchronisation

  15. Operational Requirement
  Camera requirements:
  • Existing CCTV systems
  • Cameras are a mixture of overlapping and non-overlapping
  • Internal cameras are generally fixed and colour
  Scene contents:
  • Scenes are likely to contain rest points
  • Varying traffic densities
  Target description:
  • There may be multiple targets
  • Targets from a wide demographic

  16. Imagery Collection

  17. Imagery Collection
  Volume
  • 5 cameras
  • 1.35 million frames
  • Single and multiple targets
  • 1000+ target events
  • 1 TB external HDD
  Location
  • Large transport hub (airport)
  Targets
  • Varied targets
  • Differing target behaviour
  • Varying crowd densities
  Environment
  • Lighting changes
  • Filmed at dawn, day, dusk and night

  18. (image-only slide)

  19. Dataset structure
  • Three stages: Mixed, Overlapping and Non-Overlapping (MCT01, MCT02, MCT03)
  • Each stage is made up of Target Event Sets (TES)
  • TES properties: daytime / night time, high density / low density, etc.

  20. Performance Metric
  P = Overlapping Pixels / Total Track Pixels
  R = Overlapping Pixels / Total Ground Truth Pixels
  F1 = 2·R·P / (R + P)
  • F1 must be greater than or equal to 0.25 for the track to be a True Positive
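
  A minimal sketch of the pixel-overlap score above, working from pre-computed per-track pixel counts. The use of aggregate counts (rather than per-frame boxes or masks) and the example numbers are my own simplifying assumptions:

```python
def track_f1(overlap_pixels: int, track_pixels: int, gt_pixels: int) -> float:
    """Pixel-overlap F1 between one system track and one ground-truth track."""
    if track_pixels == 0 or gt_pixels == 0:
        return 0.0
    p = overlap_pixels / track_pixels  # precision: overlapping / total track pixels
    r = overlap_pixels / gt_pixels     # recall: overlapping / total ground-truth pixels
    return 0.0 if p + r == 0 else 2.0 * r * p / (r + p)


def is_true_positive(overlap_pixels: int, track_pixels: int, gt_pixels: int,
                     threshold: float = 0.25) -> bool:
    """A track counts as a True Positive if its pixel-overlap F1 reaches the threshold."""
    return track_f1(overlap_pixels, track_pixels, gt_pixels) >= threshold


# Example: a track covering 6,000 of 10,000 ground-truth pixels, with 9,000 pixels in
# total, scores F1 of about 0.63, comfortably above the 0.25 threshold.
print(is_true_positive(overlap_pixels=6000, track_pixels=9000, gt_pixels=10000))  # True
```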

  21. Performance evaluation is important
  • Evaluations need to use more data
  • With richer content
  • With widely accepted definitions and metrics
  • Demonstrate improved performance
