
Alert Deployment 2012



Presentation Transcript


  1. Alert Deployment 2012
     Rick Kjeldsen, Cathee Cunningham

  2. Overview of Deployment Process (stage sketch below)
     • Camera Selection
       • NYPD selects cameras and defines alert use cases
       • IBM reviews and recommends changes
     • Field Test
       • Alert is configured with default tuning
         • Test mode, i.e., alerts do not route to the dashboard
       • Adjudicate alerts to determine the False Positive Rate
       • Stage drops to estimate the Hit Rate
       • Tune as required
     • Deploy
       • Send alerts to the dashboard
       • Continuous performance monitoring
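
The deck presents this as a strict three-stage pipeline, which can be sketched as a small state machine. A minimal Python sketch, assuming a single exit-criteria flag per stage; the stage names and transition mechanics are illustrative, not details from the deck:

```python
from enum import Enum, auto

class AlertStage(Enum):
    # The three top-level stages from this slide; names are illustrative.
    CAMERA_SELECTION = auto()
    FIELD_TEST = auto()   # alerts fire in test mode, not routed to the dashboard
    DEPLOYED = auto()     # alerts route to the dashboard; monitoring continues

# A camera/alert pair only advances once the current stage's exit
# criteria (review sign-off, adjudication, tuning) are satisfied.
NEXT_STAGE = {
    AlertStage.CAMERA_SELECTION: AlertStage.FIELD_TEST,
    AlertStage.FIELD_TEST: AlertStage.DEPLOYED,
}

def advance(stage: AlertStage, exit_criteria_met: bool) -> AlertStage:
    """Return the next stage if criteria pass, otherwise stay put."""
    if exit_criteria_met and stage in NEXT_STAGE:
        return NEXT_STAGE[stage]
    return stage

print(advance(AlertStage.FIELD_TEST, exit_criteria_met=True))  # AlertStage.DEPLOYED
```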

  3. Basic questions
     • Number of alerting cameras?
     • Deployment timeframe / schedule?
     • Alert mix
       • Primarily Abandoned Bag?

  4. Camera Selection details
     • NYPD proposes cameras and alert use cases
       • Ranks them by importance
     • IBM reviews proposed cameras and requested alerts
       • Identifies potential problems based on camera view and requested alert
       • Initially expert evaluation; later a formalized process
     • Review with NYPD and adjust the list as required
     • Cameras to test in the field (decision rule sketched below):
       • Good performance expected
       • Marginal performance expected, but high importance
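
The last bullet amounts to a two-branch decision rule. A minimal sketch, assuming simple "good"/"marginal" performance labels and a "high" importance category, none of which are spelled out in the deck:

```python
def select_for_field_test(expected_performance: str, importance: str) -> bool:
    """Field-test cameras where good performance is expected, plus
    marginal ones that rank as important. Labels are assumptions."""
    if expected_performance == "good":
        return True
    return expected_performance == "marginal" and importance == "high"

print(select_for_field_test("marginal", "high"))  # True
print(select_for_field_test("marginal", "low"))   # False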

  5. Formal camera analysis
     • Formalize the process for estimating performance
       • Goal: allow camera evaluation by people with basic training
     • Overview
       • Rank each camera on criteria such as Angle, Distance, Activity…
       • Combine rankings into a score that estimates performance (sketched below)
     • Current status…
     • Conclusions:
       • Results suggest some attributes are predictive
         • Loitering and high activity correlate with a high FPR
       • Insufficient data to evaluate other relationships
         • More cameras are needed in the study
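
The deck does not give the scoring formula, only that per-attribute rankings are combined into a performance estimate. A minimal weighted-sum sketch, where the attribute set, the weights, and the 1-5 ranking scale are all assumptions:

```python
# Hypothetical weights; the deck only says rankings are combined
# into a score, so these values are for illustration.
WEIGHTS = {"angle": 0.30, "distance": 0.25, "activity": 0.30, "loitering": 0.15}

def camera_score(rankings: dict) -> float:
    """Map per-attribute rankings (1 = favorable, 5 = unfavorable) to a
    0-1 score where higher predicts better alert performance."""
    return sum(w * (5 - rankings[attr]) / 4 for attr, w in WEIGHTS.items())

# A close, well-angled camera in a busy area with some loitering:
# high activity and loitering pull the score down, matching the finding
# that they correlate with a high false positive rate.
print(round(camera_score({"angle": 1, "distance": 2, "activity": 4, "loitering": 3}), 2))
```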

  6. Field Test details
     • Alert is configured with default tuning
       • Test mode, i.e., alerts do not route to the dashboard
       • Initial configuration guided by the camera attribute analysis
     • Estimate the False Positive Rate (see the sketch after this slide)
       • IBM adjudicates alerts, approximating the NYPD process
     • Stage drops to estimate the Hit Rate
       • Initially every camera
         • Practical for large numbers of cameras?
         • Only high-importance cameras?
         • Goal?
     • Tune as required
       • Tuning effort depends on camera importance rank
     • Deployment criteria?
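
Both estimates on this slide are simple proportions over small samples: the False Positive Rate from adjudicated alerts, and the Hit Rate from staged drops. A sketch of the arithmetic, with a Wilson confidence interval added (an assumption; the deck does not say how uncertainty was handled) and made-up counts for illustration:

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """95% Wilson score interval for a proportion. Staged-drop counts
    are small, so a bare ratio would overstate the estimate's precision."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return (max(0.0, center - half), min(1.0, center + half))

# Adjudicated field-test alerts (counts are made up for illustration):
false_positives, total_alerts = 12, 80
lo, hi = wilson_interval(false_positives, total_alerts)
print(f"FPR = {false_positives / total_alerts:.2f}  (95% CI {lo:.2f}-{hi:.2f})")

# Staged bag drops: detected / staged estimates the hit rate.
detected, staged = 9, 10
lo, hi = wilson_interval(detected, staged)
print(f"Hit rate = {detected / staged:.2f}  (95% CI {lo:.2f}-{hi:.2f})")
```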

  7. Deployment details
     • Send alerts to the dashboard
       • Adjudication taken over by NYPD
     • Performance monitoring (monitor sketch below)
       • Continuous FPR monitoring
       • Recurring Hit Rate monitoring?
         • Each release: recorded staged drops
         • When conditions change? New staged drops
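
Continuous FPR monitoring over a stream of adjudicated alerts can be sketched as a rolling window with an alarm threshold. The window size, threshold, and class interface here are assumptions, not details from the deck:

```python
from collections import deque

class FPRMonitor:
    """Rolling-window FPR monitor for one deployed camera."""

    def __init__(self, window: int = 200, threshold: float = 0.20):
        self.outcomes = deque(maxlen=window)  # True = adjudicated false positive
        self.threshold = threshold

    def record(self, is_false_positive: bool) -> None:
        """Record the NYPD adjudication outcome for one dashboard alert."""
        self.outcomes.append(is_false_positive)

    def fpr(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 0.0

    def needs_retuning(self) -> bool:
        """Flag the camera once a full window of alerts exceeds the
        threshold, e.g. to trigger retuning or fresh staged drops."""
        return len(self.outcomes) == self.outcomes.maxlen and self.fpr() > self.threshold

monitor = FPRMonitor(window=3, threshold=0.5)
for outcome in (True, True, False):
    monitor.record(outcome)
print(monitor.fpr(), monitor.needs_retuning())  # 0.666..., True
```

Keeping the window in alert counts rather than wall-clock time keeps the alarm comparable across cameras with very different alert volumes; a time-based window would be an equally reasonable choice.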
