
Evaluation of Public Health Surveillance Systems

CDC/CSTE Applied Epidemiology Fellowship Program Orientation 2009

Sam Groseclose, DVM, MPH

Division of STD Prevention, NCHHSTP, CCID

sgroseclose@cdc.gov

Phone: 404-639-6494


Objectives

  • Review steps in organizing & conducting surveillance system evaluation

  • Describe surveillance system attributes that should be assessed or measured

  • Describe how evaluation of a surveillance system for outbreak detection differs from evaluation of a system for detecting individual cases



“Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in public health action to reduce morbidity and mortality and to improve health.”

  • CDC. Updated guidelines for evaluating public health surveillance systems. MMWR 2001;50 (No. RR-13)




Why evaluate a surveillance system?

  • Are objectives being met?

  • Is outcome under surveillance still of public health importance?

  • Is monitoring efficient & effective?

  • Are objectives still relevant?


When to evaluate a surveillance system?

Response to changes in…

  • Priorities

  • Information needs

  • Epidemiology

  • Diagnostic procedures

  • Clinical practices

  • Data sources


Rubella incidence – historic lows

50% of rubella infections are asymptomatic

Is endemic transmission interrupted in the U.S.?

Averhoff et al CID 2006


How would you assess the adequacy of rubella surveillance?


Evaluation methods:

  • Survey: state & local health department

    • Rubella & measles surveillance practices, e.g., # measles outbreak investigations

  • Survey: state & local public health labs

    • Lab testing practices

  • Enhanced evaluation: CA, NYC, US-Mexico Border ID Surveillance Project

  • Sentinel surveillance: HMO-based


Are measles or rubella investigations occurring? Is confirmatory lab testing being conducted?

Averhoff et al CID 2006


Which jurisdictions are conducting rubella investigations?

Averhoff et al CID 2006


Conclusions:

  • No new cases found → sufficient sensitivity of surveillance

  • Rubella surveillance “rides the coattails” of measles and other rash-illness surveillance, enhancing sensitivity

Averhoff et al CID 2006


Evaluation allows proactive response to new demands

  • New epidemiologic findings → revision of case definitions?

  • Data source allows monitoring of additional health-related events?

  • Need for greater timeliness or efficiency? → use of new information technology

  • Increasing access to electronic data → protection of patient privacy, data confidentiality, & system security

  • Other…



CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems, 2001

Based on:

  • CDC’s Framework for Program Evaluation in Public Health. MMWR 1999;48(RR-11) – under revision

  • CDC’s Guidelines for evaluating surveillance systems. MMWR 1988;37(No. S-5)

Addressed:

  • Need for integrating surveillance & health information systems

  • Increasing relevance of informatics: establishing data standards, electronically exchanging health data

  • Facilitating response to emerging health threats


Examples of other guidance on public health surveillance monitoring & evaluation


Tasks in CDC’s updated guidelines

  • Engage stakeholders

  • Describe system

  • Focus evaluation design

  • Gather evidence of system’s performance

  • State conclusions & make recommendations

  • Ensure use of findings & share lessons learned


Task A. Engage stakeholders

  • Who are the system stakeholders?

  • Which ones should be involved?

  • Scope, level, & form of stakeholder involvement will vary

    • Influence design?

    • Provide data?

    • Aid interpretation?

    • Implement recommendations?


Stakeholder identification & engagement

  • Ask your supervisor

  • Who is funding the system?

  • Who uses information derived from system?

  • Does political/organizational environment allow them to influence evaluation?

    How to engage?

  • Interview – develop questions ahead of time

  • Survey – more structured, more stakeholders, more relevant if they are ‘active’


Task B. Describe system

  • Public health importance

  • Purpose, objectives, & operation

    • Planned use of data

    • Case definition

    • Population under surveillance

    • Legal authority

    • System flow chart

      • Roles & responsibilities

      • Inputs & outputs

  • Resources


Public health importance: Should this event be under surveillance?

  • Indices of frequency or burden

    • Case count?

    • Incidence rate?

  • Summary measures of population health status

    • Disability-adjusted life-years?

  • Indices of severity

    • Case-fatality rate?

    • Hospitalization rate?

  • Disparities or inequities associated?

  • Preventability?
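A minimal sketch, with hypothetical counts, of how two of these indices (an incidence rate per 100,000 population and a case-fatality rate) can be computed:

```python
# Hypothetical counts, for illustration only.
new_cases = 421            # cases reported during the year
population = 1_250_000     # population under surveillance
deaths = 9                 # deaths among reported cases
hospitalizations = 57      # hospitalized cases

incidence_rate = new_cases / population * 100_000      # cases per 100,000 per year
case_fatality_rate = deaths / new_cases * 100          # % of reported cases that died
hospitalization_rate = hospitalizations / new_cases * 100

print(f"Incidence rate: {incidence_rate:.1f} per 100,000")
print(f"Case-fatality rate: {case_fatality_rate:.1f}%")
print(f"Hospitalization rate: {hospitalization_rate:.1f}%")
```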


Public health importance: Information sources

  • Subject matter experts

  • Surveillance & research data

  • Literature review

  • Other…


Surveillance system purpose

Why does the system exist?

Example: To monitor “X health condition” in “Population under surveillance”


Surveillance system objectives

How are the data to be used for public health action?

  • Monitor burden or trends

  • Identify populations at increased risk

  • Support early detection

  • Inform risk management & decision-making

  • Evaluate interventions or policy

  • Other…


Example: Objectives for the Australian National Notifiable Disease Surveillance System (NNDSS)

Miller et al. Commun Dis Intell, 2004


Example: Australian NNDSS processes

Miller et al. Commun Dis Intell, 2004


Example: Legislative authority – Australian NNDSS

  • No legislative requirement for states and territories to send notifiable disease data to the Commonwealth

Miller et al. Commun Dis Intell, 2004


Resources

  • Direct costs

    • Person-time per year

    • IT hardware/software

    • Travel

    • Training

  • Indirect costs

    • Follow-up diagnostic lab testing

    • Case management

    • Outbreak response

  • Prevention benefits/costs from societal perspective

    • Cost of missing outbreaks

    • Productivity losses averted


Example: Resources

  • Direct costs only

  • Cost by system phase

Kirkwood et al. J Public Hlth Mngmnt Practice, 2007


Task C. Focus evaluation design

  • Specific purpose of the evaluation

    • CSTE fellowship only?

    • Public health objectives?

    • Response to health system reform?

  • Stakeholders’ input (Task A)

  • Identify questions that will be answered

  • How will information generated be used?

  • Can you define relative performance standards/metrics for attributes a priori?

    • What’s acceptable?


Task D. Gather evidence of system’s performance

Usefulness

  • What actions have been taken based on data from the system?

  • Are system objectives being met?

System attributes

  • Simplicity

  • Flexibility

  • Data quality

  • Acceptability

  • Sensitivity

  • Predictive value positive (PVP/PPV)

  • Representativeness

  • Timeliness

  • Stability


Task E. State conclusions and make recommendations

  • Conclusions

    • Important public health problem?

    • System’s objectives met?

  • Recommendations

    • Modification/continuation?

      • Consider interdependencies between system costs & attributes

    • Ethical obligations

      • Surveillance being conducted responsibly?


Example: Evaluation conclusions

Jhung et al. Medical Care, 2007


Task F. Ensure use of findings and share lessons learned

  • Deliberate effort to use results & disseminate findings?

    • Prior discussion of response to potentially negative findings?

    • Prior plan to implement recommendations based on findings?

  • Strategies for communicating findings?

    • Tailor content & method to relevant audience(s)



“The reason for collecting, analyzing and disseminating information on a disease is to control that disease. Collection and analysis should not be allowed to consume resources if action does not follow.”

Foege WH et al. Int J Epidemiology 1976

Similarly, evaluation findings should be applied for surveillance improvement.


Task D (continued). Gather evidence of system’s performance

Usefulness: What actions have been taken based on data from the system? Are system objectives being met?

System attributes: Simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive (PVP/PPV), representativeness, timeliness, stability


Example: Usefulness from the public health system perspective

Miller et al. Communicable Disease Intelligence, 2004


Example: Usefulness from an external stakeholder perspective

Miller et al. Communicable Disease Intelligence, 2004


Have your surveillance efforts resulted in any of these outcomes?

WHO/CDS/CSR/LYO/2004.15


Task D (continued). Gather evidence of system’s performance

Usefulness: What actions have been taken based on data from the system? Are system objectives being met?

System attributes: Simplicity, flexibility, data quality, acceptability, sensitivity, predictive value positive (PVP/PPV), representativeness, timeliness, stability


Timeliness

  • Different scales based on outcome & action

    • Meningococcal meningitis >> cancer

  • If timeliness is critical:

    • Active surveillance

    • Acquire electronic records

    • Encourage telephone reports on suspicion

    • Educate clinicians and lab staff

    • Review as frequently as the data arrive

    • Remove barriers to prompt reporting

  • Adjust investment to importance


When measuring timeliness, specify the types of dates used and the intervals measured.

Jajosky RA et al. BMC Public Health 2004
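A minimal sketch of that principle, using hypothetical case records and illustrative field names: each interval is defined by explicitly named dates (onset to report receipt, specimen collection to report receipt), and the median delay is summarized.

```python
from datetime import date
from statistics import median

# Hypothetical case records; field names are illustrative only.
cases = [
    {"onset": date(2009, 3, 1),  "specimen": date(2009, 3, 3),  "report": date(2009, 3, 9)},
    {"onset": date(2009, 3, 5),  "specimen": date(2009, 3, 6),  "report": date(2009, 3, 8)},
    {"onset": date(2009, 3, 10), "specimen": date(2009, 3, 12), "report": date(2009, 3, 20)},
]

# State exactly which dates define each interval being measured.
onset_to_report = [(c["report"] - c["onset"]).days for c in cases]
collection_to_report = [(c["report"] - c["specimen"]).days for c in cases]

print("Median onset-to-report delay (days):", median(onset_to_report))
print("Median collection-to-report delay (days):", median(collection_to_report))
```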


Source: CDC. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. MMWR 2004;53(No. RR-5).


Assessment of timeliness of web-based notifiable disease reporting system by local health department

Conclusion: “relatively complete and timely”

Recommended future use of test result date (vs. specimen collection date)

Vogt et al. J Public Health Management Practice 2006


Sensitivity & PVP of the system


Sensitivity

  • Affected by:

    • Case detection process

    • Case reporting process

  • Sometimes referred to as completeness of reporting

    If reporting is ‘representative’ & consistent, surveillance system may perform well with moderate sensitivity


Sensitivity for individual cases

  • High sensitivity means you miss few cases

  • To improve sensitivity:

    • Broaden case definition

    • Encourage reporting on suspicion

    • Active surveillance

    • Acquire electronic records

    • Audit sources for completeness

    • Remove barriers

  • Adjust investment to importance

  • Tradeoff with positive predictive value
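A minimal sketch of how sensitivity and PPV are computed once a gold-standard case list (e.g., from an audit) is available; the counts here are hypothetical.

```python
# Hypothetical counts from comparing system reports against a gold-standard case list.
true_positives = 180    # true cases captured by the system
false_negatives = 60    # true cases the system missed
false_positives = 20    # reports that were not true cases

sensitivity = true_positives / (true_positives + false_negatives)
ppv = true_positives / (true_positives + false_positives)

print(f"Sensitivity: {sensitivity:.0%}")   # proportion of true cases detected
print(f"PPV: {ppv:.0%}")                   # proportion of reports that are true cases
```

Broadening the case definition generally adds true positives but also false positives, which is the sensitivity-PPV tradeoff noted above.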


Measuring completeness

  • Uncorrected method:

    • # reported cases / total # cases identified through active case detection & use of supplemental data sources (gold standard?)

    • Example: Lab audit: XX% of total ‘eligible’ lab findings were reported

  • Under-ascertainment corrected method:

    • # reported cases / total # cases estimated via capture-recapture methods
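A minimal sketch of both approaches, with hypothetical counts: an uncorrected audit-based completeness ratio, and a two-source Chapman (capture-recapture) estimate of the true case count used to correct for under-ascertainment. The Chapman estimator assumes the two sources are roughly independent and that matching between them is accurate (see the capture-recapture considerations below).

```python
# Uncorrected completeness: reported cases vs. cases identified by active detection / audit.
reported = 240
identified_by_audit = 300           # hypothetical "gold standard" count
uncorrected_completeness = reported / identified_by_audit
print(f"Uncorrected completeness: {uncorrected_completeness:.0%}")

# Two-source capture-recapture (Chapman estimator) to correct for under-ascertainment.
n1 = 240    # cases in source 1 (e.g., notifiable disease reports)
n2 = 210    # cases in source 2 (e.g., hospital discharge data)
m = 150     # cases matched in both sources

n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1   # estimated true number of cases
corrected_completeness = n1 / n_hat
print(f"Estimated true cases: {n_hat:.0f}")
print(f"Corrected completeness (source 1): {corrected_completeness:.0%}")
```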


Assessing completeness: Lab reporting, January 2000–June 2004 – Uncorrected

  • 33 patients with reactive CSF-VDRLs in lab audit (5 labs so far)

  • 29 with no CSF result in registry

  • 4 with CSF result in registry: 1 with another syphilis stage entered for dx, 3 with NS entered for dx

  • Only 12% of reactive CSF-VDRLs were in the registry

Source: Neurosyphilis (NS) surveillance evaluation, NYC – Lindstrom


Assessing completeness: Provider reporting, January 2000–December 2003 – Uncorrected

SPARCS Data: New York State database of all hospital discharge diagnoses

111 Patients with primary discharge diagnosis of NS in NYC hospitals

Number with any discharge diagnosis of NS expected to be considerably higher

56 diagnosed NS cases in BSTDC registry January 2000-June 2004

Source: Neurosyphilis (NS) surveillance evaluation, NYC – Lindstrom


If using capture-recapture, consider…

  • Validity of data from each source

  • Dependency relationship between data sources

  • Criteria used for matching between data sources


Example: Capture-recapture method comparing two surveillance systems’ findings (1)

Panackal et al EID 2002


Example: Capture-recapture method comparing two surveillance systems’ findings (2)

Uncorrected: 65%

Uncorrected: 81%

Panackal et al EID 2002


Sensitivity for outbreaks

  • High sensitivity means you miss few outbreaks

  • To improve sensitivity, same as for individual cases, plus:

    • Syndromic surveillance

    • Set investigation threshold low

    • Set alarm threshold low

  • Affected by:

    • Magnitude of “signal” cases relative to “baseline” cases

    • Shape of outbreak signal – gradual vs. rapid increase

    • Outcome’s “incubation period”

  • Adjust investment to importance

  • Tradeoff with positive predictive value
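To illustrate the threshold tradeoff, here is a minimal sketch (hypothetical daily counts) of a simple aberration-detection rule that alarms when a day's count exceeds the baseline mean plus k standard deviations; lowering k increases sensitivity for outbreaks but also increases false alarms, lowering PPV.

```python
from statistics import mean, stdev

# Hypothetical daily case counts: a baseline period followed by recent days to evaluate.
baseline = [2, 3, 1, 4, 2, 3, 2, 5, 3, 2, 4, 3, 2, 3]
recent = [3, 4, 9, 12]

k = 2.0  # alarm threshold in standard deviations; lower k -> more sensitive, more false alarms
threshold = mean(baseline) + k * stdev(baseline)

for day, count in enumerate(recent, start=1):
    flag = "ALARM" if count > threshold else "ok"
    print(f"Day {day}: {count} cases (threshold {threshold:.1f}) -> {flag}")
```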


Positive Predictive Value (PPV) for cases

  • High PPV means most cases are good cases

  • To improve PPV:

    • Use narrower case definition

    • Encourage clinicians and labs to report only confirmed cases

    • Tight decision rules about electronic records to be submitted

    • Review cases before entering in system

  • Tradeoff with sensitivity and timeliness

  • Adjust investment to importance


Positive Predictive Value (PPV) for outbreaks

  • High PPV means few false alarms

  • To improve PPV, similar to cases, plus:

    • Set investigation threshold high

    • Set alarm threshold high

  • Tradeoff with sensitivity and timeliness

  • Missed outbreaks are expensive too


Negative Predictive Value (NPV) for outbreaks

  • High NPV means you can rely on system to say outbreak is NOT present

  • Measures to increase sensitivity will usually increase NPV

  • Active surveillance means more credible negatives

  • System experience best indicator of actual NPV


Biases in surveillance that result in decreased representativeness:

  • Case ascertainment bias – among case-patients in the population under surveillance, cases are either reported (true positives) or not reported (false negatives); among non-cases, individuals are either reported (false positives) or not reported (true negatives)

  • Information bias (data about the case) – a data element may be present and correct, present but incorrect, or absent


Example: Case ascertainment bias

Miller et al. Commun Dis Intell 2004


Representativeness of sentinel surveillance? Gonococcal Isolate Surveillance Project (GISP)


Data quality

  • Assess data element completeness

  • Assess data validity:

    • Review sample of surveillance data?

    • Conduct record linkage?

    • Interview case patients? Chart review?

  • Influenced by:

    • Data collection formats

    • Training & supervision

    • Data management

  • Impacts surveillance system acceptability, representativeness, & usefulness
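A minimal sketch, with hypothetical records and field names, of assessing data-element completeness, i.e., the share of case records in which each field is populated:

```python
# Hypothetical case records; None marks a missing data element.
records = [
    {"age": 34,   "sex": "F", "race": None, "county": "A"},
    {"age": None, "sex": "M", "race": "B",  "county": "A"},
    {"age": 51,   "sex": "F", "race": "W",  "county": None},
]

fields = ["age", "sex", "race", "county"]
for field in fields:
    complete = sum(1 for r in records if r.get(field) is not None)
    print(f"{field}: {complete / len(records):.0%} complete")
```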


Example: Data quality

  • Completeness of data elements

  • Validity: compared case reports to independent chart review

Jhung et al. Medical Care, 2007


Do you need full information on all cases?

  • Collecting all data on all cases may result in poor data quality

  • Alternative approach: Gather core demographic data on all cases, but full case report on only selected ones.

    • Unusual cases by age, sex, location, time of year?

    • Cases during suspected outbreak?

    • Random sample?


Stability

  • Able to collect, manage, & provide data properly without failure?

  • Stable data source(s)?

  • Methods sustainable?

  • Able to operate system when needed?

    • % of time system fully operational? All times? All locations?

    • Desired & actual amount of time required for system to release data?
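These stability measures reduce to simple ratios; a minimal sketch with hypothetical figures:

```python
# Hypothetical operational figures for one year.
scheduled_hours = 365 * 24
downtime_hours = 36               # unplanned outages
desired_release_lag_days = 7      # target time from data receipt to data release
actual_release_lag_days = 11      # observed average

uptime = (scheduled_hours - downtime_hours) / scheduled_hours
print(f"System fully operational {uptime:.1%} of the time")
print(f"Release lag: {actual_release_lag_days} days (target {desired_release_lag_days})")
```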


Example: Simplicity – system structure & ease of operation

  • Single data source

  • Trained data abstractor

  • Multiple agencies involved

Jhung et al. Medical Care, 2007


Flexibility

  • Able to collect new information?

  • Able to modify case definition (e.g., aggregation of codes for syndromic surveillance)?

  • Able to apply evolving standards (e.g., data, messaging)?

  • Standalone system >> Integrated system


Privacy & confidentiality

  • Defined roles & authorities for viewing & use of sensitive/personal information?

  • Controlled dissemination of sensitive/personal information?

    • Suppression of personal identifiers?

    • Systematic prevention of indirect identification of individuals by protocol & practice?

  • Defined data use and release policies?

  • Defined policy for response to inappropriate data release?


Conceptual public health system framework in which public health surveillance and action occur, indicating domains relevant for surveillance system evaluation:

Public Health Action

  • Acute (epidemic-/disaster-type): case management; partner/contact services; outbreak investigation & management

  • Planned (management-type): preparedness; feedback & dissemination; health education; program management & development; program planning & priority-setting

Public Health Surveillance Processes

  • Detection; registration; reporting; confirmation/investigation (epi/lab); data management & analysis; interpretation; dissemination & feedback

Surveillance Support Functions

  • Case definitions; standards, protocols, & guidelines; monitoring & evaluation; leadership, management, & supervision; workforce; resources; logistics; clinical & laboratory services; training

Public Health System Infrastructure

  • Legislation & policy; organizational setting; planning & strategy; stakeholders; implementers; networks & partnerships; community; local, state, & federal government; health system

Adapted from McNabb SJN et al. Ann Epidemiol, 2004


Summary

  • Periodic evaluation required

  • Interplay between attributes

    • Relative importance of selected attributes for different surveillance systems & purposes

    • As you change one attribute, you impact others

    • With finite resources, emphasize different attributes based on surveillance objectives

    • Changing tolerance for ‘missing outbreaks’ or other performance characteristics

  • Guidelines themselves will evolve

  • Guidelines can (& should) be adapted


Challenges

  • Regular, frequent monitoring of surveillance processes

  • Balance technology & people to optimize surveillance

  • Standards development with public health input

    • Electronic health record

  • Collaboration with health care sector

  • Coordinated national surveillance network in a federal system


Thanks to Bob German, Richard Hopkins, & Heather Lindstrom

*********************

The findings and conclusions in this presentation are those of the author and do not necessarily represent the official position of the Centers for Disease Control and Prevention.


Goal of outbreak detection: identify the signal rapidly and with high accuracy when the true distinction between baseline (noise) & outbreak (signal) cases is unknown.

Outbreak detection is easier when the incidence & variation of baseline cases are low relative to outbreak cases.


What is the conceptual relationship between the data source and an incident case of disease?

How does the sampling frame or frequency of data collection affect outbreak detection?

