
Evaluation of Surveillance Systems


Presentation Transcript


  1. Evaluation of Surveillance Systems St Luke's-Roosevelt

  2. Problems with our field • Programs often do more harm than good • Programs don’t collect data, so no benefit shown • The data we do collect is often not useful for improving program quality or guiding policy

  3. How do we show benefit, impact, change? • Surveillance: ongoing • Surveys: one point in time

  4. Definition • Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event, for use in public health action to reduce morbidity and mortality and to improve health (CDC, Atlanta, GA)

  5. Key concepts • Ongoing • Action

  6. Why evaluate a surveillance system? • Ensure that problems of public health importance are being monitored efficiently and effectively • Recommendations about the system should focus on improving quality, efficiency, and usefulness

  7. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability

  8. What should be evaluated? • System attributes: determine priorities • Simplicity: a practical structure combined with ease of use • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability

  9. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility: ability to adapt to changing information needs or operating conditions with minimal time, effort, cost • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability

  10. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality: completeness and validity • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability
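
Completeness, one half of the data quality attribute above, can be measured directly as the share of records with a value for each key field. A minimal sketch in Python; the field names and records are hypothetical, not taken from the presentation:

    # Illustrative sketch only: the field names and records below are
    # hypothetical, not taken from the presentation.
    records = [
        {"age": 34, "onset_date": "2024-05-01", "lab_result": "positive"},
        {"age": None, "onset_date": "2024-05-03", "lab_result": None},
        {"age": 61, "onset_date": None, "lab_result": "negative"},
    ]

    def completeness(records, field):
        """Share of records with a non-missing value for one field."""
        filled = sum(1 for r in records if r.get(field) is not None)
        return filled / len(records)

    for field in ("age", "onset_date", "lab_result"):
        print(f"{field}: {completeness(records, field):.0%} complete")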

  11. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability: willingness of persons or organizations to participate • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability

  12. Contraceptive prevalence rates in Afghanistan, WHO

  13. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity: ability to detect cases OR ability to detect outbreaks • Predictive value positive • Representativeness • Timeliness • Stability

  14. Incidence* of Shigella Dysentery, Central Bosnia, 1991-1993

      Region          Prewar    May-July 1993
      Sarajevo City   0.3       4.0 (+1250%)
      Zenica City     0.3       4.4 (+1690%)
      Tuzla Region    0.5       0.4 (-10%)

      *Cases per 100,000 per month
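
The table combines two simple calculations: an incidence rate per 100,000 population per month, and the relative change between two periods. A rough sketch using the slide's rounded Sarajevo rates; the case counts and population figure below are invented for illustration, and the slide's +1250% was presumably computed from unrounded rates:

    # Sketch of the arithmetic behind the table: incidence per 100,000
    # population per month, and relative change between two periods.
    def incidence_per_100k(cases, population, months):
        return cases / population * 100_000 / months

    def percent_change(before, after):
        return (after - before) / before * 100

    # e.g. 40 cases in a population of 330,000 over 3 months (invented numbers):
    rate = incidence_per_100k(cases=40, population=330_000, months=3)  # ~4.0

    prewar = 0.3    # Sarajevo City, cases per 100,000 per month, prewar
    wartime = 4.0   # Sarajevo City, May-July 1993
    # With these rounded inputs this prints about +1233%.
    print(f"{percent_change(prewar, wartime):+.0f}%")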

  15. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive: proportion of persons identified as cases who truly are cases • Representativeness • Timeliness • Stability
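
Sensitivity and predictive value positive, defined on this slide and the previous one, are both simple proportions from a comparison of what the system reports against what truly occurred. A minimal sketch with invented counts:

    # Minimal sketch: both measures are proportions built from a comparison
    # of reported cases against true cases. The counts below are invented.
    true_pos = 45    # reported cases that are real cases
    false_neg = 15   # real cases the system missed
    false_pos = 5    # reports that turn out not to be cases

    sensitivity = true_pos / (true_pos + false_neg)   # share of real cases detected
    ppv = true_pos / (true_pos + false_pos)           # share of reports that are real
    print(f"sensitivity = {sensitivity:.0%}, PPV = {ppv:.0%}")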

  16. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness: system accurately describes events over time and space (time, person, place) • Timeliness • Stability

  17. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness: speed between steps in the system; appropriateness of any delays • Stability
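
One common way to quantify timeliness is the delay between two steps, such as symptom onset and receipt of the report, summarized as a median. A minimal sketch with invented dates:

    # Sketch of one timeliness measure: the delay between symptom onset and
    # receipt of the report, summarized as a median. Dates are invented.
    from datetime import date
    from statistics import median

    cases = [
        (date(2024, 6, 1), date(2024, 6, 4)),    # (onset, report received)
        (date(2024, 6, 2), date(2024, 6, 10)),
        (date(2024, 6, 5), date(2024, 6, 7)),
    ]

    delays = [(reported - onset).days for onset, reported in cases]
    print(f"median onset-to-report delay: {median(delays)} days")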

  18. Epidemic curve, outbreak of mumps, Montreal

  19. Epidemic curve, cholera
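
Epidemic curves like the mumps and cholera examples above are histograms of case counts by date of onset. A minimal plotting sketch with invented data:

    # Minimal sketch of how an epidemic curve is drawn: case counts binned
    # by day of onset. The data are invented.
    from collections import Counter
    import matplotlib.pyplot as plt

    onset_days = [1, 2, 2, 3, 3, 3, 4, 4, 5, 7, 8, 8, 9]
    counts = Counter(onset_days)

    plt.bar(list(counts.keys()), list(counts.values()), width=0.9)
    plt.xlabel("Day of onset")
    plt.ylabel("Number of cases")
    plt.title("Epidemic curve (illustrative data)")
    plt.show()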

  20. What should be evaluated? • System attributes: determine priorities • Simplicity • Flexibility • Data quality • Acceptability • Sensitivity • Predictive value positive • Representativeness • Timeliness • Stability: reliability and availability; resources needed to keep the system running

  21. Steps in evaluating a surveillance system • Engage stakeholders • Describe the system: importance, purpose, resources • Focus the evaluation design • Gather evidence regarding performance • Justify and state conclusions, and make recommendations

  22. Malaria Surveillance • Purpose (CDC): (a) identify local transmission; (b) guide prevention recommendations for travelers • Additional benefits (JE): identify emerging species, treatment failures, and local outbreaks • Historically: tracking elimination • Case definition: malaria cases confirmed by blood film, rapid diagnostic test, or PCR

  23. Malaria Surveillance • The system • National Malaria Surveillance System • National Notifiable Diseases Surveillance System (since 1878: cholera, smallpox, plague, and yellow fever reported from overseas consulates) • Direct CDC consultation
