Evaluation of Public Health Surveillance Systems

CDC/CSTE Applied Epidemiology Fellowship Program Orientation 2009

Sam Groseclose, DVM, MPH

Division of STD Prevention, NCHHSTP, CCID

sgroseclose@cdc.gov

Phone: 404-639-6494

Objectives
  • Review steps in organizing & conducting surveillance system evaluation
  • Describe surveillance system attributes that should be assessed or measured
  • Describe how evaluating a surveillance system for outbreak detection differs from evaluating one for individual case detection
“Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in public health action to reduce morbidity and mortality and to improve health.”
  • CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001;50 (No. RR-13)
Why evaluate a surveillance system?
  • Are objectives being met?
  • Is outcome under surveillance still of public health importance?
  • Is monitoring efficient & effective?
  • Are objectives still relevant?
When to evaluate a surveillance system?

Response to changes in…

  • Priorities
  • Information needs
  • Epidemiology
  • Diagnostic procedures
  • Clinical practices
  • Data sources

Rubella incidence – historic lows

50% of rubella infections are asymptomatic

Is endemic transmission interrupted in U.S.?

Averhoff et al. CID 2006

Evaluation methods:
  • Survey: state & local health departments
    • Rubella & measles surveillance practices, e.g., # measles outbreak investigations
  • Survey: state & local public health labs
    • Lab testing practices
  • Enhanced evaluation: CA, NYC, US-Mexico Border ID Surveillance Project
  • Sentinel surveillance: HMO-based
Are measles or rubella investigations occurring? Is confirmatory lab testing being conducted?

Averhoff et al. CID 2006


Conclusions:

  • No new cases found → sufficient sensitivity of surveillance
  • Rubella surveillance "rides the coattails" of measles and other rash illness surveillance, enhancing its sensitivity

Averhoff et al. CID 2006

Evaluation allows proactive response to new demands.
  • New epidemiologic findings → revision of case definitions?
  • Data source allows monitoring of additional health-related events?
  • Need for greater timeliness or efficiency? → use of new information technology
  • Increasing access to e-data → protection of patient privacy, data confidentiality, & system security
  • Other…
CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems, 2001

Based on:

  • CDC’s Framework for Program Evaluation in Public Health. MMWR 1999;48(RR-11) – under revision
  • CDC’s Guidelines for evaluating surveillance systems. MMWR 1988;37(No. S-5)

Addressed:

  • Need for integrating surveillance & health information systems
  • Increasing relevance of informatics:
    • Establishing data standards
    • Electronically exchanging health data
  • Facilitating response to emerging health threats

Examples of other guidance on public health surveillance monitoring & evaluation
Tasks in CDC’s updated guidelines
  • Engage stakeholders
  • Describe system
  • Focus evaluation design
  • Gather evidence of system’s performance
  • State conclusions & make recommendations
  • Ensure use of findings & share lessons learned
Task A. Engage stakeholders
  • Who are the system stakeholders?
  • Which ones should be involved?
  • Scope, level, & form of stakeholder involvement will vary
    • Influence design?
    • Provide data?
    • Aid interpretation?
    • Implement recommendations?
Stakeholder identification & engagement
  • Ask your supervisor
  • Who is funding the system?
  • Who uses information derived from system?
  • Does the political/organizational environment allow them to influence the evaluation?

How to engage?

  • Interview – develop questions ahead of time
  • Survey – more structured, more stakeholders, more relevant if they are ‘active’
Task B. Describe system
  • Public health importance
  • Purpose, objectives, & operation
    • Planned use of data
    • Case definition
    • Population under surveillance
    • Legal authority
    • System flow chart
      • Roles & responsibilities
      • Inputs & outputs
  • Resources
Public health importance: Should this event be under surveillance?
  • Indices of frequency or burden
    • Case count?
    • Incidence rate?
  • Summary measures of population health status
    • Disability-adjusted life-years?
  • Indices of severity
    • Case-fatality rate?
    • Hospitalization rate?
  • Disparities or inequities associated?
  • Preventability?
Public health importance: Information sources?
  • Subject matter experts
  • Surveillance & research data
  • Literature review
  • Other…
Surveillance system purpose

Why does the system exist?

Example: To monitor “X health condition” in “Population under surveillance”

Surveillance system objectives

How are the data to be used for public health action?

  • Monitor burden or trends
  • Identify populations at increased risk
  • Support early detection
  • Inform risk management & decision-making
  • Evaluate interventions or policy
  • Other…
Example: Objectives for Australian National Notifiable Disease Surveillance System (NNDSS)

Miller et al. Commun Dis Intell, 2004

Example: Australian NNDSS processes

Miller et al. Commun Dis Intell, 2004

Example: Legislative authority: Australian NNDSS
  • No legislative requirement for states and territories to send notifiable disease data to the Commonwealth.

Miller et al. Commun Dis Intell, 2004

Resources
  • Direct costs
    • Person-time per year
    • IT hardware/software
    • Travel
    • Training
  • Indirect costs
    • Follow-up diagnostic lab testing
    • Case management
    • Outbreak response
  • Prevention benefits/costs from societal perspective
    • Cost of missing outbreaks
    • Productivity losses averted
Example: Resources
  • Direct costs only
  • Cost by system phase

Kirkwood et al. J Public Health Management Practice, 2007

Task C. Focus evaluation design
  • Specific purpose of the evaluation
    • CSTE fellowship only?
    • Public health objectives?
    • Response to health system reform?
  • Stakeholder’s input (Task A)
  • Identify questions that will be answered
  • How will information generated be used?
  • Can you define 'relative' performance standards or metrics for the attributes a priori?
    • What’s acceptable?
Task D. Gather evidence of system’s performance

Usefulness?

  • What actions have been taken based on data from the system?
  • Are system objectives being met?

System attributes:

  • Simplicity
  • Flexibility
  • Data quality
  • Acceptability
  • Sensitivity
  • Predictive value positive (PVP/PPV)
  • Representativeness
  • Timeliness
  • Stability
Task E. State conclusions and make recommendations
  • Conclusions
    • Important public health problem?
    • System’s objectives met?
  • Recommendations
    • Modification/continuation?
      • Consider interdependencies between system costs & attributes
    • Ethical obligations
      • Surveillance being conducted responsibly?
Example: Evaluation conclusions

Jhung et al. Medical Care, 2007

Task F. Ensure use of findings and share lessons learned
  • Deliberate effort to use results & disseminate findings?
    • Prior discussion of response to potentially negative findings?
    • Prior plan to implement recommendations based on findings?
  • Strategies for communicating findings?
    • Tailor content & method to relevant audience(s)

“The reason for collecting, analyzing and disseminating information on a disease is to control that disease. Collection and analysis should not be allowed to consume resources if action does not follow.”

Foege WH et al. Int J Epidemiol 1976

Similarly, evaluation findings should be applied for surveillance improvement.

Task D (continued). Gather evidence of system’s performance: usefulness
Example: Usefulness from public health system perspective

Miller et al. Commun Dis Intell, 2004

Example: Usefulness from external stakeholder perspective

Miller et al. Commun Dis Intell, 2004

Have your surveillance efforts resulted in any of these outcomes?

WHO/CDS/CSR/LYO/2004.15

Task D (continued). Gather evidence of system’s performance: system attributes
Timeliness
  • Different scales based on outcome & action
    • Meningococcal meningitis >> cancer
  • If timeliness is critical:
    • Active surveillance
    • Acquire electronic records
    • Encourage telephone reports on suspicion
    • Educate clinicians and lab staff
    • Review as frequently as the data arrive
    • Remove barriers to prompt reporting
  • Adjust investment to importance

When measuring timeliness, specify the types of dates used and the intervals measured.

Jajosky RA et al. BMC Public Health 2004
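As a concrete illustration, here is a minimal sketch of one such measurement, assuming hypothetical field names (`onset_date`, `report_date`); the Jajosky paper's point is precisely that the dates defining the interval must be stated.

```python
from datetime import date
from statistics import median

# Hypothetical case records; the field names are illustrative only.
cases = [
    {"onset_date": date(2009, 3, 1), "report_date": date(2009, 3, 9)},
    {"onset_date": date(2009, 3, 4), "report_date": date(2009, 3, 6)},
    {"onset_date": date(2009, 3, 7), "report_date": date(2009, 3, 21)},
]

# Timeliness here = days from symptom onset to receipt of the case report.
# Other intervals (e.g., specimen collection to report) are equally valid;
# whichever is used, the defining dates should be reported explicitly.
lags = [(c["report_date"] - c["onset_date"]).days for c in cases]
print(f"median onset-to-report lag: {median(lags)} days")  # -> 8 days
```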

Source: CDC. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. MMWR 2004;53(No. RR-5).

Assessment of timeliness of a web-based notifiable disease reporting system by local health departments

Conclusion: “relatively complete and timely”

Recommended future use of test result date (vs. specimen collection date)

Vogt et al. J Public Health Management Practice 2006

Sensitivity
  • Affected by:
    • Case detection process
    • Case reporting process
  • Sometimes referred to as completeness of reporting

If reporting is ‘representative’ & consistent, a surveillance system may perform well even with moderate sensitivity

sensitivity for individual cases
Sensitivity for individual cases
  • High sensitivity means you miss few cases
  • To improve sensitivity:
    • Broaden case definition
    • Encourage reporting on suspicion
    • Active surveillance
    • Acquire electronic records
    • Audit sources for completeness
    • Remove barriers
  • Adjust investment to importance
  • Tradeoff with positive predictive value (illustrated in the sketch below)
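A minimal sketch of that tradeoff with invented counts (none of these numbers come from the presentation): broadening the case definition raises sensitivity but admits more false positives, lowering PPV.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of true cases the system captured: TP / (TP + FN)."""
    return tp / (tp + fn)

def ppv(tp: int, fp: int) -> float:
    """Fraction of reported cases that are true cases: TP / (TP + FP)."""
    return tp / (tp + fp)

# Hypothetical counts out of 100 true cases in the population.
scenarios = {
    "narrow case definition": {"tp": 60, "fn": 40, "fp": 5},
    "broad case definition": {"tp": 90, "fn": 10, "fp": 40},
}

for name, c in scenarios.items():
    print(f"{name}: sensitivity={sensitivity(c['tp'], c['fn']):.2f}, "
          f"PPV={ppv(c['tp'], c['fp']):.2f}")
# narrow case definition: sensitivity=0.60, PPV=0.92
# broad case definition: sensitivity=0.90, PPV=0.69
```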
Measuring completeness
  • Uncorrected method (a minimal sketch follows this list):
    • # reported cases / total # cases identified through active case detection & supplemental data sources (gold standard?)
    • Example: lab audit: XX% of total ‘eligible’ lab findings were reported
  • Under-ascertainment–corrected method:
    • # reported cases / total # cases estimated via capture-recapture methods
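The uncorrected method is a simple ratio; the sketch below mirrors the NYC neurosyphilis lab audit on the next slide (4 of 33 reactive CSF-VDRLs already in the registry).

```python
# Uncorrected completeness = reported cases / cases found by active detection.
# Counts mirror the NYC neurosyphilis lab audit on the next slide.
audit_total = 33   # reactive CSF-VDRLs found in the lab audit
in_registry = 4    # of those, already present in the surveillance registry

print(f"uncorrected completeness: {in_registry / audit_total:.0%}")  # -> 12%
```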
Assessing Completeness: Lab Reporting, January 2000-June 2004 (Uncorrected)

  • 33 patients with reactive CSF-VDRLs in lab audit (5 labs so far)
    • 4 with a CSF result in the registry
      • 3 entered with a neurosyphilis (NS) diagnosis
      • 1 entered with another syphilis stage diagnosis
    • 29 with no CSF result in the registry
  • Only 12% of reactive CSF-VDRLs were in the registry

Source: Neurosyphilis (NS) surveillance evaluation, NYC - Lindstrom

Assessing Completeness: Provider Reporting, January 2000-December 2003 (Uncorrected)

  • SPARCS data: New York State database of all hospital discharge diagnoses
  • 111 patients with a primary discharge diagnosis of NS in NYC hospitals
    • Number with any discharge diagnosis of NS expected to be considerably higher
  • 56 diagnosed NS cases in BSTDC registry, January 2000-June 2004

Source: Neurosyphilis (NS) surveillance evaluation, NYC - Lindstrom

If using capture-recapture, consider…
  • Validity of data from each source
  • Dependency relationship between data sources (the sketch below assumes independence)
  • Criteria used for matching between data sources
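For the two-source case, a standard choice is the Lincoln-Petersen estimator (here with Chapman's small-sample correction); a sketch with invented counts, noting that the independence assumption is exactly the dependency caveat above.

```python
def chapman_estimate(n1: int, n2: int, m: int) -> float:
    """Estimated total cases from two overlapping data sources.

    n1, n2 -- cases found by source 1 and source 2
    m      -- cases matched in BOTH sources
    Assumes the two sources capture cases independently.
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Invented counts for illustration only.
n1, n2, matched = 120, 150, 80
total = chapman_estimate(n1, n2, matched)
print(f"estimated total cases: {total:.0f}")                  # ~225
print(f"corrected completeness, source 1: {n1 / total:.0%}")  # ~53%
```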
Example: Capture-recapture method comparing two surveillance systems’ findings (1)
Panackal et al. EID 2002

Example: Capture-recapture method comparing two surveillance systems’ findings (2)

Uncorrected completeness of the two systems: 65% and 81%

Panackal et al. EID 2002

Sensitivity for outbreaks
  • High sensitivity means you miss few outbreaks
  • To improve sensitivity, same as for individual cases, plus:
    • Syndromic surveillance
    • Set investigation threshold low
    • Set alarm threshold low
  • Affected by:
    • Magnitude of “signal” cases relative to “baseline” cases
    • Shape of outbreak signal – gradual vs. rapid increase
    • Outcome’s “incubation period”
  • Adjust investment to importance
  • Tradeoff with positive predictive value
Positive Predictive Value (PPV) for cases
  • High PPV means most cases are good cases
  • To improve PPV:
    • Use narrower case definition
    • Encourage clinicians and labs to report only confirmed cases
    • Tight decision rules about electronic records to be submitted
    • Review cases before entering in system
  • Tradeoff with sensitivity and timeliness
  • Adjust investment to importance
Positive Predictive Value (PPV) for outbreaks
  • High PPV means few false alarms
  • To improve PPV, similar to cases, plus:
    • Set investigation threshold high
    • Set alarm threshold high
  • Tradeoff with sensitivity and timeliness
  • Missed outbreaks are expensive too
Negative predictive value (NPV) for outbreaks
  • High NPV means you can rely on the system when it indicates an outbreak is NOT present
  • Measures that increase sensitivity will usually increase NPV
  • Active surveillance yields more credible negatives
  • System experience is the best indicator of actual NPV (see the sketch below)
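To make the threshold tradeoff behind these attributes concrete, here is a toy sketch of a mean-plus-k-standard-deviations alarm rule on invented weekly counts (real aberration-detection methods, such as those in the CDC 2004 early-detection framework, are more sophisticated).

```python
from statistics import mean, stdev

# Invented weekly case counts: a quiet baseline, then three "outbreak" weeks.
baseline = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]
weeks = baseline + [8, 14, 9]
truth = [False] * len(baseline) + [True] * 3

mu, sigma = mean(baseline), stdev(baseline)

def alarms(k: float) -> list[bool]:
    """Alarm whenever the weekly count exceeds mean + k * SD of baseline."""
    return [w > mu + k * sigma for w in weeks]

for k in (1.0, 3.0):
    a = alarms(k)
    tp = sum(x and t for x, t in zip(a, truth))
    fp = sum(x and not t for x, t in zip(a, truth))
    print(f"k={k}: outbreak weeks flagged {tp}/3, false alarms {fp}")
# k=1.0: outbreak weeks flagged 3/3, false alarms 1  (sensitive, lower PPV)
# k=3.0: outbreak weeks flagged 2/3, false alarms 0  (specific, higher PPV)
```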
Biases in surveillance that result in decreased representativeness:

  • Case ascertainment bias (whether a person is captured by the system):
    • Case-patients: reported (true positive) or not reported (false negative)
    • Non-cases: reported (false positive) or not reported (true negative)
  • Information bias (data about the case), for both reported case-patients and reported non-cases:
    • Data element present & correct
    • Data element present but incorrect
    • Data element absent
Example: Case ascertainment bias

Miller et al. Commun Dis Intell, 2004

Data quality
  • Assess data element completeness (sketched after this list)
  • Assess data validity:
    • Review sample of surveillance data?
    • Conduct record linkage?
    • Interview case patients? Chart review?
  • Influenced by:
    • Data collection formats
    • Training & supervision
    • Data management
  • Impacts surveillance system acceptability, representativeness, & usefulness
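A minimal sketch of assessing data-element completeness, assuming case reports arrive as records with possibly missing values (the field names are hypothetical).

```python
# Hypothetical case reports; None marks a missing data element.
reports = [
    {"age": 34, "sex": "F", "county": "Fulton", "race": None},
    {"age": None, "sex": "M", "county": "DeKalb", "race": "B"},
    {"age": 51, "sex": "F", "county": None, "race": None},
]

for field in ["age", "sex", "county", "race"]:
    filled = sum(r.get(field) is not None for r in reports)
    print(f"{field}: {filled}/{len(reports)} complete "
          f"({filled / len(reports):.0%})")
# age: 2/3 (67%), sex: 3/3 (100%), county: 2/3 (67%), race: 1/3 (33%)
```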
Example: Data quality
  • Completeness of data elements
  • Validity: compared case reports to independent chart review

Jhung et al. Medical Care, 2007

Do you need full information on all cases?
  • Collecting all data on all cases may result in poor data quality
  • Alternative approach: Gather core demographic data on all cases, but full case report on only selected ones.
    • Unusual cases by age, sex, location, time of year?
    • Cases during suspected outbreak?
    • Random sample?
Stability
  • Able to collect, manage, & provide data properly without failure?
  • Stable data source(s)?
  • Methods sustainable?
  • Able to operate system when needed?
    • % of time system fully operational? All times? All locations? (uptime sketch below)
    • Desired & actual amount of time required for system to release data?
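The operational-uptime question above is simple arithmetic once outages are logged; a sketch with invented downtime figures.

```python
# Hypothetical outage log for one quarter (hours of downtime per incident).
outage_hours = [4.0, 1.5, 6.5]
quarter_hours = 91 * 24  # roughly one quarter

uptime = 1 - sum(outage_hours) / quarter_hours
print(f"system fully operational {uptime:.1%} of the time")  # -> 99.5%
```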
Example: Simplicity (system structure, ease of operation)
  • Single data source
  • Trained data abstractor
  • Multiple agencies involved

Jhung et al. Medical Care, 2007

Flexibility
  • Able to collect new information?
  • Able to modify case definition (e.g., aggregation of codes for syndromic surveillance) ?
  • Able to apply evolving standards (e.g., data, messaging) ?
  • Standalone system >> Integrated system
Privacy & confidentiality
  • Defined roles & authorities for viewing & use of sensitive/personal information?
  • Controlled dissemination of sensitive/personal information?
    • Suppression of personal identifiers?
    • Systematic prevention of indirect identification of individuals by protocol & practice?
  • Defined data use and release policies?
  • Defined policy for response to inappropriate data release?
Conceptual public health system framework in which public health surveillance and action occur, indicating domains relevant for surveillance system evaluation:

Public Health Action

  • Acute (epidemic-/disaster-type)
    • Case management
    • Partner/contact services
    • Outbreak investigation & management
  • Planned (management-type)
    • Preparedness
    • Feedback & dissemination
    • Health education
    • Program management & development
    • Program planning & priority-setting

Public Health Surveillance Processes

  • Detection
  • Registration
  • Reporting
  • Confirmation/investigation (epi/lab)
  • Data management & analysis
  • Interpretation
  • Dissemination & feedback

Surveillance Support Functions

  • Case definitions
  • Standards, protocols, & guidelines
  • Monitoring & evaluation
  • Leadership, management, & supervision
  • Workforce
  • Resources
  • Logistics
  • Clinical & laboratory services
  • Training

Public Health System Infrastructure

  • Legislation & policy
  • Organizational setting
  • Planning & strategy
  • Stakeholders
  • Implementers
  • Networks & partnerships

Context: health system; community; local, state, & federal government

Adapted from McNabb SJN et al. Ann Epidemiol, 2004

Summary
  • Periodic evaluation required
  • Interplay between attributes
    • Relative importance of selected attributes for different surveillance systems & purposes
    • As you change one attribute, you impact others
    • With finite resources, emphasize different attributes based on surveillance objectives
    • Changing tolerance for ‘missing outbreaks’ or other performance characteristics
  • Guidelines themselves will evolve
  • Guidelines can (& should) be adapted
Challenges
  • Regular, frequent monitoring of surveillance processes
  • Balance technology & people to optimize surveillance
  • Standards development with public health input
    • Electronic health record
  • Collaboration with health care sector
  • Coordinated national surveillance network in a federal system
Thanks to Bob German, Richard Hopkins, & Heather Lindstrom

*********************

The findings and conclusions in this presentation are those of the author and do not necessarily represent the official position of the Centers for Disease Control and Prevention.


Goal of outbreak detection: identify signal rapidly with high accuracy when true distinction between baseline (noise) & outbreak (signal) cases is unknown.

Outbreak detection is easier when the incidence & variation of baseline cases are low relative to outbreak cases.

What’s the conceptual relationship between the data source and an incident case of disease?

How does the sampling frame or frequency of data collection affect outbreak detection?