

Program Evaluation Webinar Series Part 2:

“Getting Started and Engaging Your Stakeholders”

Presented by: Leslie Fierro and Carlyn Orians


Getting Started and Engaging Your Stakeholders

  • Leslie A. Fierro, MPH

  • TKCIS Contractor NCEH/CDC

  • [email protected]

  • Carlyn Orians, MA

  • Battelle Centers for Public Health Research and Evaluation

  • [email protected]

Presented November 20, 2008


Agenda

  • Definition of evaluation.

  • Difference between evaluation & research.

  • CDC’s Evaluation Framework.

  • The who, why, when & how of stakeholders.

  • Tangible examples of engaging stakeholders.


What is the Purpose of Evaluation?


Evaluation: Definition 1

Evaluation is the systematic investigation of the merit, worth, or significance of an object.

-- Michael Scriven


Evaluation: Definition 2

Evaluation is the systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards as a means of contributing to the improvement of the program or policy.

-- Carol Weiss


Evaluation: Definition 3

Evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.

-- Michael Patton


The Common Element

Note that all three of these definitions share a common element: they all define evaluation as a systematic and formalized endeavor.


Summative Evaluations

  • Summative evaluations seek to judge a program by asking, “Should this program be…”

    • cancelled?

    • continued?

    • expanded?


Formative Evaluations

  • Formative evaluations seek to use the evaluation findings to improve programs or policies.

  • They ask questions such as:

    • “Is the program being conducted as planned?”

    • “Is the program doing well?”


The CDC Definition

  • Evaluation is the systematic collection of information about the activities, characteristics and outcomes of the program to make judgments about the program, improve program effectiveness and/or inform decisions about future program development.


Research vs. Evaluation

Research and evaluation share methodologies but ask different questions.

[Venn diagram: Specific to Research | Commonalities | Specific to Evaluation]


The Research Context


Research Findings

Research is conducted to:

  • test and improve theories.

  • develop generalizable theories that apply across different settings, people, and times.


The Evaluation Context


Evaluation Findings

Evaluation results:

  • are not usually generalizable.

  • focus on a specific situation: the evaluation of a single program in a particular context.


Research vs. Evaluation

  • Evaluation asks:

    • “Is this program working?”

  • Research asks:

    • “Will this program work across multiple settings?”


The Role of Researchers

Researchers:

Play a single role as content experts.

Identify gaps in current knowledge.

Derive their own questions.

Perform their own research.


The Role of Evaluators

Evaluators:

Play multiple roles—facilitator, educator, scientific expert, etc.

Involve stakeholders.

Collaborate to identify and prioritize questions.


In a Nutshell…

“Research seeks to prove; evaluation seeks to improve.”

-- M.Q. Patton


The Findings Must be Useful

  • To improve a program, the findings must be useful!

  • How?

  • The stakeholders must be involved.

  • The questions must be relevant to the program.

  • The findings must be credible to key stakeholders.


Getting Started and Engaging Your Stakeholders

  • The Who, When, Why, and How of Stakeholder Involvement in CDC’s Evaluation Framework


“Who” are the Stakeholders?

  • Stakeholders are people and/or organizations that:

    • are interested in the program,

    • are interested in the results of the evaluation, and/or

    • have a stake in what will be done with the results of the evaluation.


Examples of Stakeholders


Each Program is Different

  • Develop a list of stakeholders at the start of any evaluation activity.



Which Stakeholders Matter Most?

  • Review your list of stakeholders and think strategically about these questions:

    • “Who do we need to…”

    • enhance credibility?

    • implement program changes?

    • advocate for changes?

    • fund, authorize, or expand the program?


“Why” Engage Stakeholders?

  • Engaging stakeholders is an important part of the CDC Evaluation Framework.

    • Involving stakeholders may be a requirement of your program.

  • Stakeholders will add credibility.

    • If you want the results to be acted upon, they must be credible.

  • Stakeholders may have resources to help.

    • Stakeholders may be able to contribute data, analytical skills, etc.

  • Stakeholders may be critical to implementing or advocating for action based on the results.

  • You will build trust and understanding among program constituents.

    • Involving stakeholders helps to reduce fear of the evaluation process.


“Why” Would Stakeholders Want to be Involved?

  • Get answers to their questions.

  • Learn about evaluation.

  • Influence the design and methods.

  • Protect their constituents.

  • Help the program succeed.


CDC’s Evaluation Framework

[Framework diagram — Steps: 1. Engage stakeholders; 2. Describe the program; 3. Focus the evaluation design; 4. Gather credible evidence; 5. Justify conclusions; 6. Ensure use and share lessons learned. Standards (at the center): Utility, Feasibility, Propriety, Accuracy.]


“When” and “How” Can You Engage Stakeholders?

  • You can use the CDC Evaluation Framework to engage stakeholders in:

    • identifying and prioritizing evaluation questions,

    • selecting credible sources, and

    • developing a plan for action based on evaluation results.


“When” and “How” Can You Engage Stakeholders?

  • You can engage stakeholders in every step of the evaluation process.

  • But… you don’t have to engage stakeholders in all these ways in every evaluation.



“How” Depends on…

  • Evaluator preference.

  • Stakeholder preference.

  • Resources.

  • Degree of trust or threats to credibility.

  • If there is a high degree of mistrust, engage stakeholders early in the evaluation process. This helps ensure that the results are viewed as credible and are acted upon.




CDC’s Framework Step 1: Engage Stakeholders

  • Who should be involved?

    • Develop a list of potential stakeholders.

    • Decide which stakeholders are the most important to include.


CDC’s Framework Step 2: Describe the Program

  • Do stakeholders share a vision of what the program does and its intended outcomes?

    • Diverse views?

    • Similar views?

    • Engage them early in the process.


CDC’s Framework Step 3: Focus the Evaluation Design

  • What are the most pressing and important evaluation questions for stakeholders?

  • What questions do they need answered to be able to take action?

  • What methods are available and preferred?

  • What will be considered “credible evidence”?


CDC’s Framework Step 4: Gather Credible Evidence

  • Gather evidence stakeholders will find credible.


CDC’s Framework Step 5: Justify Conclusions

  • How do diverse stakeholders interpret the findings?

  • May engage stakeholders in the analysis.

  • Perhaps solicit their interpretation before results are finalized.


CDC’s Framework Step 6: Ensure Use and Share Lessons Learned

  • Which stakeholders can play a role in disseminating results or acting on findings?

  • This is a critical step for stakeholder involvement.


Getting Started and Engaging Your Stakeholders

  • Tangible Examples of Stakeholder Involvement in the Evaluation Process



Example #1 – Asthma and Home Environment in Low-Income Apartments

  • Intervention to improve the indoor environment for children with asthma via the following activities:

    • Provide education and training for apartment owners, building inspectors, maintenance vendors, and tenants regarding asthma triggers and housing codes.

    • Work with city officials to enhance the existing housing code.

    • Promote smoke-free housing.




Step 1: Engage the Stakeholders




Step 2: Describe the Program




Step 3: Focus the Evaluation Design




Step 4: Gather Credible Evidence




Step 5: Justify Conclusions




Step 6: Ensure Use and Share Lessons Learned




What if you ignored stakeholders?

  • If you ignore the stakeholders, potential pitfalls include…




What if you ignored stakeholders?

If you fail to involve the stakeholders, your evaluation may not lead to action!



Example #2 – Care Coordination Across Health Systems

Intervention to provide and integrate care coordination and case management for high-risk children with asthma.

Involves standardizing protocols across care systems, including:

  • Medicaid HMOs

  • Home nursing agencies

  • Health departments

The goal: to prove success and thus convince insurers to continue reimbursing case management services.


Engage the Stakeholders




Focus the Evaluation Design




Gather Credible Evidence





Example #3 – Daycare Education

Intervention to train childcare providers to identify triggers and to manage children with asthma.

Intervention reaches out to:

  • Large daycare centers

  • Licensed home daycare programs

Participation is encouraged through continuing education credits.

Long-term goal: to expand beyond pilot providers to the larger community.


Engage the Stakeholders



Focus the Evaluation Design




Gather Credible Evidence



Getting Started and Engaging Your Stakeholders

  • Tangible Examples of Stakeholder Involvement in Applying the Evaluation Standards


Evaluation Standards Apply to Every Step

[Framework diagram — the four standards (Utility, Feasibility, Propriety, Accuracy) sit at the center of all six steps.]


The Evaluation Standards

  • Utility

  • Feasibility

  • Propriety

  • Accuracy


The Evaluation Standards

There are actually 30 evaluation standards grouped into four categories.

A complete list of the standards is published in CDC’s Framework for Program Evaluation in Public Health. (See link under “Learning Aids”.)


Why Use Standards?

  • Standards provide a way to:

  • Make difficult decisions when designing and implementing an evaluation.

  • Judge the quality of an evaluation.

  • Determine where an evaluation can be better balanced.


The “Utility” Standard


The “Feasibility” Standard


The “Propriety” Standard


The “Accuracy” Standard


CDC’s Asthma Control Program

  • CDC’s Asthma Control Program funds 35 states and territories to implement statewide asthma control programs using a public health perspective.

  • The program’s approach involves:

  • Engaging, enhancing, and maintaining relationships with partners.

  • Developing, improving, and conducting asthma surveillance activities.

  • Designing and implementing interventions with partners.


Example #4 - Asthma Program Monitoring System

  • This example involves collecting information for the purpose of evaluation across multiple sites.


Example #4 - Asthma Program Monitoring System

  • Purpose:

    Develop a strategic, systematic approach to collecting information about our program activities, progress, and accomplishments across all funded states.


Employing the Framework Model

  • Step 1: Involve the stakeholders

    • CDC Air Pollution and Respiratory Health Branch.

    • Representatives of funded State Asthma Programs.

  • Step 2: Describe the program

    • Develop logic models.

  • Step 3: Focus the evaluation design

    • Identify and prioritize evaluation questions.

  • Step 4: Gather credible evidence

    • Create data collection instrument.


Developing the Data Collection Instrument

  • In this example, developing the data collection instrument was a highly collaborative endeavor aided by the program evaluation standards.


The Iterative Process

  • CDC internal workgroup identifies core information needs and drafts survey instrument.

  • State workgroup members review instrument, pilot test, provide pilot data and comments.

  • Create mock report to illustrate type of information obtained from instrument.

  • CDC internal workgroup discusses comments and mock report.

  • Modify instrument as necessary.


Applying the Standards

  • How were the evaluation standards employed in this example?

  • How was “engaging the stakeholders” important in applying each of the standards?


The “Utility” Standard


The Utility Standard in Action

  • Determined what information CDC needed to do its job and how it would use the information.

  • Engaged states in discussions about how to make the information useful for them.

  • Developed mock reports to facilitate conversations with stakeholders about how the data could be used.


The “Feasibility” Standard


The Feasibility Standard in Action

  • Carefully selected “need-to-know” questions.

  • Asked states:

    • What information do you already have?

    • How feasible is it for you to obtain?

    • How long does it take to locate information?

    • How long does it take to fill out the data collection instrument?

  • Always kept the burden of data collection as low as possible.


The Feasibility Standard in Action

  • The content of the data collection instrument was a careful balance between feasibility and utility.

  • High-priority data requiring a higher collection burden were balanced by eliminating lower-priority questions.


The “Propriety” Standard


The Propriety Standard in Action

  • Used a highly collaborative process.

  • Fostered open, honest dialogue about expectations and concerns.

  • Remained highly responsive to stakeholder concerns.

  • Modified instrument to address stakeholder concerns.


The “Accuracy” Standard


Balancing the Standards

When balancing feasibility vs. accuracy, consider the effect on propriety.


The Accuracy Standard in Action

  • Involved states in:

    • Pilot tests and reviews.

    • Discussions about “guessing”.

    • Analyzing the quality of the information provided.

    • Revising questions or collecting information that would help to improve accuracy.

    • Decisions to discard some questions.


What We Have Reviewed Today

  • The goal of evaluation and role of stakeholders.

  • Who to engage as stakeholders.

  • Why it is important to engage stakeholders.

  • When and how to engage stakeholders.

  • Stakeholder involvement across all steps of CDC framework.

  • Stakeholder involvement in fulfilling the evaluation standards.

  • Examples: single and multiple sites.


Thank You for Joining Us!

  • Leslie Fierro: [email protected]

  • Carlyn Orians: [email protected]

