Developing An Evaluation Plan For TB Control Programs

Developing An Evaluation Plan for TB Control Programs

Division of Tuberculosis Elimination
National Center for HIV, STD, and TB Prevention
Centers for Disease Control and Prevention



Reference: A Guide to Developing an Evaluation Plan

Why Develop an Evaluation Plan?
  • Provides a cohesive approach to conducting evaluation and using the results
  • Guides evaluation activities
    • Explains what, when, how, why, who
  • Documents the evaluation process for all stakeholders
  • Ensures implementation fidelity
Guide to Developing An Evaluation Plan
  • Document referenced throughout presentation
  • Provides a template and instructions to help TB program staff develop an evaluation plan
  • Steps to evaluation are explained in detail
  • Completing sections and tables will result in an evaluation plan
The CDC Program Evaluation Framework
  • Systematic method for evaluation
    • Based on research and experience
    • Flexible and adaptable
  • Promotes a participatory approach
  • Focuses on using evaluation findings
Sections of an Evaluation Plan
  • Introduction
  • Stakeholder Assessment
    • Step 1: Engage Stakeholders
  • Background and Description of the TB Program and Program Logic Model
    • Step 2: Describe the Program
  • Focus of the Evaluation
    • Step 3: Focus the Evaluation Design
Sections of an Evaluation Plan
  • Gathering Credible Evidence: Data Collection
    • Step 4: Gather Credible Evidence
  • Justifying Conclusions: Analysis and Interpretation
    • Step 5: Justify Conclusions
  • Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination
    • Step 6: Ensure Use and Share Lessons Learned

Introduction

An introduction provides background information, identifies the purpose of the evaluation, and provides a roadmap of the plan.

  • Evaluation Goal
    • What is the purpose of the evaluation?
  • Evaluation Team
    • Who is your evaluation coordinator?
    • Who are the members of your evaluation team?

Reference: Table 1 in the Evaluation Plan Guide

Stakeholder Assessment

Stakeholders are individuals with vested interests in the success of the TB program. Involving stakeholders increases the credibility of the evaluation and ensures that findings are used as intended.

  • Who are the stakeholders in your TB program?
  • What are their interests in the evaluation?
  • What role do they play in the evaluation?
  • How do you plan to engage the stakeholders?

Reference: Table 2 in the Evaluation Plan Guide

Background and Description of the TB Program

The program description ensures that stakeholders have a shared understanding of the program and identifies any unfounded assumptions and gaps.

  • What problem does your program address?
  • What are the causes and consequences of the problem?
  • What is the magnitude of the problem?
  • What changes or trends affect the problem?
  • What environmental factors affect your program?
  • Target Population
    • Does your program target the TB concerns of one population?
  • Program Objectives
    • What objectives have been set for your program?
  • Stage of Development
    • Is this a new initiative, or is it well established?
  • What resources are available to conduct the program activities?
  • What are program staff doing to accomplish program objectives?
  • What are the direct and immediate results of program activities (materials produced, services delivered, etc.)?
  • What are the intended effects of the program activities?

Reference: Table 3 in the Evaluation Plan Guide
Program Logic Model

A logic model is a graphic depiction of the program description.

  • Arrows describe the links between resources, activities, outputs and outcomes
  • A logic model
    • Provides a sense of scope of your program
    • Ensures that systematic decisions are made about what is to be measured
    • Helps to identify and organize indicators
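As a rough sketch of the idea (not from the guide; the component names below are illustrative), a logic model can be treated as a data structure: labeled stages, arrows linking each stage to the next, and an indicator attached to each element. Because every element sits in a named stage, the model makes it systematic to decide what gets measured.

```python
# Hypothetical sketch of a logic model as a data structure.
# Stage and element names are illustrative, not taken from the guide.
from collections import defaultdict

logic_model = {
    "inputs": ["Trained staff", "Data collection tools"],
    "activities": ["Interview cases", "Locate and evaluate contacts"],
    "outputs": ["Contacts identified", "Contacts evaluated"],
    "outcomes": ["Contacts complete treatment", "TB prevented"],
}

# Arrows: each stage feeds the next -- the links the logic model depicts.
arrows = list(zip(list(logic_model), list(logic_model)[1:]))

# Pairing every element with a placeholder indicator shows how the model
# helps identify and organize what to measure.
indicators = defaultdict(list)
for stage, elements in logic_model.items():
    for element in elements:
        indicators[stage].append(f"Indicator for: {element}")

print(arrows)
# [('inputs', 'activities'), ('activities', 'outputs'), ('outputs', 'outcomes')]
```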

Contact Investigation Logic Model

Goal: Prevent TB among contacts to cases (by finding and testing contacts for TB and LTBI, and then treating infected contacts to completion).

[The original slide presents the logic model as a diagram of linked boxes; the recoverable elements are listed below.]

Inputs (adequate infrastructure):
  • Qualified, trained, and motivated staff
  • Community and congregate setting partnerships
  • Policies, procedures, and …
  • Ongoing data collection, monitoring, and reporting
  • Adequate physical, diagnostic, and treatment …
  • Linkages between jurisdictions
  • Adequate data collection tools
  • Partnership with private providers
  • Legal mandate to collect contact information from congregate settings
  • Comprehensive interview tool
  • Staff trained in interview techniques

Activities:
  • Interview/reinterview cases: build rapport, provide education, obtain information about source case and …
  • Locate and evaluate contacts: follow-up, education, examination and testing
  • Treat contacts – case management
  • Conduct periodic review of cases/contacts and progress toward contact treatment goals
  • Data collection, data management, data analysis, data dissemination

Outputs:
  • Cases identify contacts
  • Contacts followed up
  • Contacts evaluated
  • Contacts start treatment

Outcomes:
  • Contacts complete appropriate treatment for active TB or LTBI
  • Active TB cured in contacts
  • TB prevented in contacts with LTBI
  • Evidence-based decisions about continuation or termination of contact investigation
  • Improved approaches for contact investigation
  • Reduced incidence and prevalence of TB
  • TB eliminated
Focus of the Evaluation

Since you cannot feasibly evaluate everything, you must focus the evaluation by prioritizing and selecting evaluation questions.

  • Stakeholder Needs
    • Who will use the evaluation findings?
    • How will the findings be used?
    • What do stakeholders need to learn/know from the evaluation?
Focus of the Evaluation
  • Process Evaluation
    • What resources were required?
    • What program activities were accomplished?
    • Were they implemented as planned?
  • Outcome Evaluation
    • Is the program producing the intended outcomes?
    • Is there progress toward program objectives and goals?
Focus of the Evaluation
  • Evaluation Questions
    • Based on the needs of your stakeholders
    • Address process and outcome
  • Assess Your Questions
    • Feasible to collect
    • Provide accurate results
Focus of the Evaluation
  • Key Issues in Evaluation Design
    • Will you have a comparison or control group?
    • When will you collect data?
    • Will the data be collected retrospectively or prospectively?
    • What type of data do you need?
    • What data do you have already?
Focus of the Evaluation
  • Other Design Considerations
    • Standards for “good” evaluation
    • Timeliness
    • Stage of development
    • Data needed
  • Strengthen Your Design
    • Mix methods whenever possible
    • Use repeated measures
    • Triangulate
Gathering Credible Evidence: Data Collection

Identify indicators, standards, and data sources to address evaluation questions.

  • Indicators
    • Visible, measurable signs of program performance
    • Reflect program objectives, logic model and evaluation questions
  • Program Benchmarks and Targets
    • Reasonable expectations of program performance
    • Benchmarks against which to measure performance

Reference: Table 4 in your Evaluation Plan Guide
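A program target works as a benchmark: compare the observed indicator value against it to judge performance. As a minimal sketch (the numbers and the 90% target below are illustrative, not from the guide):

```python
# Hypothetical example: checking an indicator against a program target.
def meets_target(numerator: int, denominator: int, target_pct: float) -> bool:
    """Return True if observed performance meets or exceeds the target."""
    if denominator == 0:
        raise ValueError("No eligible cases: indicator is undefined")
    return 100.0 * numerator / denominator >= target_pct

# Illustrative indicator: % of eligible contacts who completed evaluation,
# measured against an illustrative 90% target.
completed, eligible, target = 45, 50, 90.0
print(meets_target(completed, eligible, target))  # True (45/50 = 90%)
```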


Gathering Credible Evidence: Data Collection

Linking evaluation questions, indicators, and program benchmarks. Example from the Guide – Table 4.

Gathering Credible Evidence: Data Collection
  • Data Collection
    • Where are the data?
    • What methods will be used to collect data?
    • How often will the data be collected?
    • Who will collect the data?
  • Tools for Data Collection
    • Collect only the information you need
    • Easy to administer and use

Reference: Table 5 in your Evaluation Plan Guide


Gathering Credible Evidence: Data Collection

Linking indicators and data sources and specifying your data collection plan. Example from the Guide – Table 5.

Gathering Credible Evidence: Data Collection
  • Human Subjects Considerations
  • Evaluation Timeline
    • Ensures that all stakeholders are aware of what activities are occurring at any time
    • Helps to determine if your evaluation resources will be strained by too many activities happening at once
  • Data Management and Storage
    • Ensures confidentiality and data quality

Reference: Table 6 in your Evaluation Plan Guide
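The timeline check described above can be sketched as counting how many activities run in each period and flagging the crowded ones. The activity names, dates, and the threshold of three are all illustrative assumptions, not from the guide:

```python
# Hypothetical sketch: flagging months where overlapping evaluation
# activities may strain resources. All names and dates are illustrative.
from collections import Counter

activities = {
    "Design data collection tools": (1, 3),   # (start month, end month)
    "Collect chart review data": (3, 6),
    "Stakeholder interviews": (3, 5),
    "Analysis and interpretation": (6, 8),
}

# Count how many activities are under way in each month.
load = Counter()
for start, end in activities.values():
    for month in range(start, end + 1):
        load[month] += 1

# Months with three or more concurrent activities may need rescheduling.
busy = sorted(m for m, n in load.items() if n >= 3)
print(busy)  # [3]
```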

Justifying Conclusions: Analysis and Interpretation

Once the data are collected, analysis and interpretation will help you understand what the findings mean for your program.

  • Analysis
    • What analysis techniques will you use for each data collection method?
    • Who is responsible for analysis?
  • Interpretation
    • What conclusions will you draw from your findings?
    • How will you involve stakeholders?

Reference: Table 7 in your Evaluation Plan Guide

Ensuring Use and Sharing Lessons Learned: Reporting and Dissemination

A plan for dissemination and use of the evaluation findings will avoid having evaluation reports “sit on the shelf.”

  • Dissemination
    • What medium will you use to disseminate findings?
    • Who is responsible for dissemination?
  • Use
    • How, where, and when will findings be used?
    • Who will act on the findings?

Reference: Table 8 in your Evaluation Plan Guide

Tips for Evaluation Planning
  • Start small – focus on one initiative or program component, and limit the number of evaluation questions
  • Use what you already know about the program
  • Consider existing sources of data
  • Be realistic in your timeline and assessment of resources
  • Use the template and tables provided in the guide, adapt as needed
  • Seek help with your evaluation
Evaluation Resources

Some Web-Based Resources

  • Centers for Disease Control and Prevention:
  • W.K. Kellogg Foundation:
  • University of Wisconsin Extension:

Selected Publications

  • Connell JP, Kubisch AC, Schorr LB, Weiss CH. New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute, 1995.
  • Patton MQ. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications, 1997.
  • Rossi PH, Freeman HE, Lipsey MW. Evaluation: A Systematic Approach. Newbury Park, CA: Sage Publications, 1999.
  • Taylor-Powell E, Steele S, Douglas M. Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension, 1996.