
PETS International Workshops on Performance Evaluation of Tracking and Surveillance



Presentation Transcript


  1. PETS International Workshops on Performance Evaluation of Tracking and Surveillance. James Ferryman, Computational Vision Group, Department of Computer Science, The University of Reading, UK. ETISEO, Nice, May 11-12 2005

  2. PETS International Workshops on Performance Evaluation of Tracking and Surveillance. Supported by [sponsor logos]

  3. Introduction • Theme - Performance Evaluation of Tracking and Surveillance • Successful tracking of object motion is key to visual surveillance • PETS started in Grenoble, France in 2000 as a satellite workshop of FG2000 • Not a competition • http://visualsurveillance.org • ftp://pets.rdg.ac.uk

  4. PETS - History • PETS’2000 was held at FG’2000, 31 March 2000, Grenoble, France. • PETS’2001 at CVPR’01. • PETS’2002 at ECCV, Copenhagen, Denmark, June 1 2002. • PETS2003 at ICVS, Graz; VS-PETS at ICCV2003 • PETS2004 at ECCV04 • WAMOP-PETS, CO, USA (Jan 05) as part of the IEEE Winter Workshop Series • 2005: VS-PETS at ICCV’05

  5. Datasets – Example – PETS2001 • Five separate sets of training and test sequences. • All datasets are multiview (frame synchronised). • Datasets were significantly more challenging than PETS2000 (significant lighting variation, occlusion, scene activity and use of multiview data)

  6. Datasets [image slide: example frames from Dataset 1, Dataset 2 and Dataset 3]

  7. Dataset 1 [image slide]

  8. Dataset 2 [image slide]

  9. Dataset 4 [image slide]

  10. PETS - Prerequisites • Reported tracking results: • should be obtained on the test sequences, though the training sequences may optionally be used if the algorithms require them (for learning etc.) • may be based on a single camera view of the scene, or on multiple-view data • can be based on the entire test sequence or a portion of it; the images may be converted to any other format and/or subsampled • results must be submitted in XML format

  11. PETS – Workshop Overview • XX contributed papers • ~3 sessions: e.g. appearance-based tracking, people and vehicle tracking, multiview tracking • Y invited speakers • Demonstration session • Overall evaluation and discussion

  12. Quantitative PE - XML • XML provides a mechanism for setting up a “syntax” file in the form of a schema • The schema is used to automatically validate object tracking results (a validation sketch follows this slide) • For PETS’2001, two schemas were used: • low-level tracking results • high-level surveillance (understanding object motions and interactions)
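
  As an illustration of the schema-based validation just described: a minimal Python sketch, assuming the lxml package is available; the file names used are hypothetical placeholders, not files distributed with PETS.

    # Hedged sketch: validate a PETS2001-style results file against the XML Schema.
    # Assumes lxml is installed; "results_d1c1.xml" and "surveillance.xsd" are
    # hypothetical local file names.
    from lxml import etree

    def validate_results(xml_path, xsd_path):
        """Return True if the tracking-results XML conforms to the schema."""
        schema = etree.XMLSchema(etree.parse(xsd_path))
        doc = etree.parse(xml_path)
        ok = schema.validate(doc)
        if not ok:
            # Report each schema violation with its line number.
            for err in schema.error_log:
                print(f"line {err.line}: {err.message}")
        return ok

    if __name__ == "__main__":
        validate_results("results_d1c1.xml", "surveillance.xsd")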

  13. Quantitative PE - XML

  <?xml version='1.0' encoding='ISO-8859-1' ?>
  <!-- Example file for visual surveillance reporting          -->
  <!--  -> scene understanding with multiple cameras.          -->
  <!-- Edited by PETS2001.Reading.JMF (J.M.Ferryman@reading.ac.uk) -->
  <people_tracker xmlns="http://www.cvg.cs.reading.ac.uk/PETS2001"
                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  xsi:schemaLocation="http://www.cvg.cs.reading.ac.uk/PETS2001
                    http://www.cvg.cs.reading.ac.uk/PETS2001/XML/surveillance.xsd">
    <!-- this is a comment ... add more as appropriate -->
    <header>
      <recording site="PETS2001 (Reading)" session="1" date="01/06/01">
        <list_cameras num_cameras="2">
          <camera camera_id="1"/>
          <camera camera_id="2"/>
        </list_cameras>
      </recording>

  14. Quantitative PE - XML

      <video start_frame="1" end_frame="1450" step="2" fps="25"/>
      <!-- step: stepping used to read the list of processed frames below -->
      <!--       for (i = start; i <= end; i += step)                     -->
      <!-- fps: frames per second of the original video                   -->
      <image xdim="768" ydim="576" colour="1"/>
      <!-- dimensions of the video images and whether they are in colour (0,1) -->
      <software name="Reading People Tracker" platform="Linux"
                version="0.03" run_date="12/07/00">
        <!-- information about the software this file originates from -->
        <object_detector name="Reading People Tracker" platform="Linux"
                         version="0.03" run_date="08/06/01"/>
      </software>
    </header>

  15. Quantitative PE - XML

    <sequence>
      <!-- the actual data: a sequence of one or more frames. -->
      <frame id="2" num_targets="1">
        <target id="8" start_frame="1" end_frame="2">
          <!-- a "target" is any object which moves or may move, usually a person,
               group of people, or a vehicle. The target's id is GLOBAL to all the
               cameras defined in "list_cameras" -->
          <!-- start_frame and end_frame indicate when the target has been tracked.
               end_frame may be unknown because it is in the future; in this case the
               longest known time where the object was tracked will be given -->
          <track status="4" location="0" speed="200" trajectory="1"
                 t_confidence="0.5" num_parents="2">
            <!-- track is part of a graph representing the tracks of all targets -->
            <!-- the status of a graph node explains how the node of the current
                 target has been created or tracked. The following values may be used
                 and added together as appropriate:
                   0 : default value, already tracked
                   1 : new track (id did not exist before)
                   2 : re-appearing object (id copied from last occurrence)
                   4 : merging (more than one parent in graph)
                   8 : splitting (at least one parent in graph has more than 1 child)
                  16 : lost (object NOT found in current image; given position etc.
                       are estimates (if available) or previous values)
                  32 : out of field of view (tracked object not "visible" as per
                       definition (see elsewhere)) -->
            <!-- location values are defined as the sum of the following:
                   0 : undefined
                   1 : roadway
                   2 : in close proximity to vehicle (parking lot)
                   4 : on grass/verge
                   8 : other -->
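
  As a further illustration, the following minimal Python sketch reads a results file with the frame/target structure shown above and expands the additive track status code into its flags; the results file name is a hypothetical placeholder.

    # Hedged sketch: walk frames/targets of a PETS2001-style results file with the
    # standard library and decode the additive "status" attribute.
    import xml.etree.ElementTree as ET

    NS = {"p": "http://www.cvg.cs.reading.ac.uk/PETS2001"}

    STATUS_FLAGS = {1: "new track", 2: "re-appearing", 4: "merging",
                    8: "splitting", 16: "lost", 32: "out of field of view"}

    def decode_status(status):
        """Expand the additive status code into its individual flag names."""
        return [name for bit, name in STATUS_FLAGS.items() if status & bit] or ["already tracked"]

    tree = ET.parse("results_d1c1.xml")  # hypothetical file name
    for frame in tree.getroot().iter("{http://www.cvg.cs.reading.ac.uk/PETS2001}frame"):
        for target in frame.findall("p:target", NS):
            track = target.find("p:track", NS)
            if track is None:
                continue
            status = int(track.get("status", "0"))
            print(frame.get("id"), target.get("id"), decode_status(status))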

  16. D1C1: XML output [image slide]

  17.-22. D1 C1 - 1 [image slides]

  23.-27. D1 C1 - 2 [image slides]

  28.-33. D1 C1 - 3 [image slides]

  34.-38. D1C1: XML output 1-5 [image slides]

  39. Performance Evaluation • A surveillance system can be evaluated in a number of ways: • object detection lag • object centroid position error • object area error • track incompleteness factor • accuracy of semantics of interaction • object identity error • maintenance of identity through occlusion • … (two of these measures are sketched below)
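
  By way of illustration only (the slide does not prescribe formulas), two of the listed measures, object centroid position error and object area error, might be computed for a matched ground-truth/result pair as in the sketch below; the (x, y, w, h) bounding-box convention is an assumption.

    # Hedged sketch: two per-object error measures for one matched box pair.
    # Boxes are (x, y, w, h) in pixels, an assumed convention.

    def centroid(box):
        x, y, w, h = box
        return (x + w / 2.0, y + h / 2.0)

    def centroid_position_error(gt_box, det_box):
        """Euclidean distance between ground-truth and detected centroids (pixels)."""
        (gx, gy), (dx, dy) = centroid(gt_box), centroid(det_box)
        return ((gx - dx) ** 2 + (gy - dy) ** 2) ** 0.5

    def area_error(gt_box, det_box):
        """Relative difference between detected and ground-truth box areas."""
        gt_area = gt_box[2] * gt_box[3]
        det_area = det_box[2] * det_box[3]
        return abs(det_area - gt_area) / float(gt_area)

    print(centroid_position_error((100, 50, 40, 80), (104, 53, 38, 84)))  # ~5.8 px
    print(area_error((100, 50, 40, 80), (104, 53, 38, 84)))               # ~0.0025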

  40. [Table slide: columns Image Format, Processing Speed, Processor]

  41. Discussion • Evaluation criteria are application dependent • Training data – required or not? • representative examples • how much? • Semantics of XML schema • Ground truth • difficult to obtain • automatic evaluation - how? (one simple matching step is sketched below)
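
  One candidate ingredient for such automatic evaluation, sketched here as an assumption rather than an ETISEO or PETS procedure, is a per-frame greedy matching of result centroids to ground-truth centroids within a distance threshold.

    # Hedged sketch: greedily match result centroids to ground-truth centroids for
    # one frame; unmatched ground truth counts as misses, unmatched results as
    # false alarms. The 25-pixel threshold is an arbitrary illustrative value.

    def match_frame(gt_centroids, det_centroids, max_dist=25.0):
        """Return a list of (gt_index, det_index) pairs within max_dist pixels."""
        pairs, used = [], set()
        for gi, (gx, gy) in enumerate(gt_centroids):
            best, best_d = None, max_dist
            for di, (dx, dy) in enumerate(det_centroids):
                if di in used:
                    continue
                d = ((gx - dx) ** 2 + (gy - dy) ** 2) ** 0.5
                if d < best_d:
                    best, best_d = di, d
            if best is not None:
                pairs.append((gi, best))
                used.add(best)
        return pairs

    print(match_frame([(120, 90), (300, 40)], [(123, 95), (500, 400)]))  # [(0, 0)]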

  42. PETS Evaluation • +ve: “Mindset” – engaging the community – change of culture • +ve: Repository of data (PETS01 most frequently accessed) • +ve: Discussion/presentation of methodologies, metrics, tools … • +ve: Filters through to conferences/published literature • -ve: For the workshop, choice of dataset(s) and annotation • -ve: More quantitative evaluation needed

  43. PETS, ETISEO and the future … • Online web-based evaluation service • (Semi-)automatic validation of XML against ground truth • Repository of algorithms (incl. “strawman”), and tabulated results (rank?) • Methodology for evaluation • Metrics • More challenging datasets (e.g. multiview) • Live workshop sessions on “unseen” data • Expectation that ETISEO will support PETS

  44. PETS’05 • ICCV ’05, Beijing, China • 15-16 October 2005 • http://www.cbsr.ia.ac.cn/conferences/VS-PETS-2005 • http://visualsurveillance.org/PETS2005
