
Module 1 Overview of Evaluation




Presentation Transcript


  1. Module 1: Overview of Evaluation. NBCCEDP: Enhancing Program Performance through Evaluation

  2. Module 1: Overview of Evaluation Welcome to Module 1: Overview of Evaluation. This module consists of the following sections: Objectives; Description of Evaluation; Types of Evaluation; Program Outcomes; CDC Framework for Program Evaluation; Planning an Evaluation; My Evaluation Plan; References; Module 1 Quiz. Choose a section from the menu above, or choose the Next button below to view the objectives, overview, and organization of this module.

  3. Module 1 Objectives At the conclusion of Module 1 you should be able to: • Identify and describe the six steps in the Centers for Disease Control and Prevention (CDC) Evaluation Framework • Explain the benefits of program evaluation to your program

  4. Module 1 Overview This module will provide an introduction to evaluation and describe the importance of evaluation to BCCEDPs. It will describe the CDC Program Evaluation Framework, a process for conducting a program evaluation, and offer examples for evaluating components of BCCEDPs.

  5. Organization of Module 1 In this module, you will visit the following sections: • Description of evaluation • Presentation of the CDC Evaluation Framework and an evaluation case study • Development of your own evaluation plan At the end of the module, you can take a short quiz to assess the knowledge you have gained about evaluation.

  6. Definition of Program Evaluation Evaluation is one of the essential functions of the NBCCEDP and serves to support all of the activities of the program’s main components: • Recruitment • Screening and Diagnostic Services • Data Management • Professional Development • Partnerships • Program Management • Quality Assurance and Improvement

  7. Definition of Program Evaluation Evaluation is defined as the systematic documentation of the operations and outcomes of a program, compared to a set of explicit standards, objectives, or expectations.1 • Systematic implies that evaluation is carefully planned and implemented to ensure that its results are credible, useful, and used. • Information represents all the evaluation data that are collected about the program to help make the “judgments” or decisions about program activities. • Activities and outcomes identify what BCCEDPs do, the actions of the program, and the effects of the program. Programs are required to have an evaluation plan for all essential program components in order to determine whether the components are reaching the desired outcomes.

  8. Why is Evaluation Important? The purpose of program evaluation is to assess the program’s implementation (process), outcomes (effectiveness), and costs (efficiency). It gathers useful information to aid in planning, decision-making, and improvement. Evaluation aims to better serve BCCEDP participants, program partners, and your own program by: • maximizing your ability to improve the health status of women in your community through the program’s activities. • demonstrating accountability and sound management of resources.

  9. Evaluation tells you: Is our program working? Evaluating BCCEDP components helps you to: • Monitor compliance with CDC guidance. • Identify what your BCCEDP has done. • Learn about your program’s strengths and successes. • Identify program needs and weaknesses. • Identify effective and ineffective activities. • Improve the quality, effectiveness, and efficiency of your program. • Improve program operations and outcomes. • Recognize gaps in the overall program. • Demonstrate program effectiveness to stakeholders. • Use findings for program planning, monitoring, and decision making. In short, evaluation helps you show that you are achieving outcomes. It helps your communities, funders, and partners see that your BCCEDP gets results. Best of all, it helps you demonstrate that aspects of your program work.

  10. Why is Evaluation Important? Program Reflection Now, take a moment to think about why evaluation is important to your program. Consider 2-3 reasons why evaluation might be critical. Click here to go to a Program Reflections Word document to work through as you go through this module.

  11. Types of Evaluation There are three main types of evaluation that you can conduct: 1) Process (evaluation of activities) 2) Impact (evaluation of direct effects of the program) 3) Outcome (evaluation of longer lasting benefits of the program) Ideally, any program will conduct a combination of all types of evaluation in order to get a well-rounded picture of what is actually going on. The use of a specific type of evaluation will depend on the purpose of the evaluation, your program’s stage of development, and available resources. More established activities will have less need for basic process evaluation and may focus more on impact and outcome evaluation. The following section describes the three main types of evaluation. Click here for a summary of the Types of Evaluation.

  12. Types of Evaluation: Process Evaluation A process evaluation focuses on: • how a program works to attain specific goals and objectives, • how well implementation is going, and • what barriers to implementation exist. Process evaluation is known as the “counting” or “documenting” of activities and should answer the question: Are we doing what we said we would do? It can also help identify problems with program implementation early. Process evaluation is most appropriate when your program is already being implemented or maintained and you want to measure how well the program process is being conducted. These process data often provide insight into why outcomes are not reached. Sample process evaluation questions include: • How many women were screened by the time of your progress report or annual report? • Where were the program-sponsored professional development events held in the last year compared to the location of your clinics? • How many women at clinic A were advised by a provider to be screened? • Are providers satisfied with the program?
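
The first sample question above is a simple count. For programs that keep screening records in a flat-file export, that count could be produced with a short script like the sketch below. This is only an illustration added for this rewrite, not part of the module: the file name, column names, client ID field, and ISO date format are all assumptions, not NBCCEDP data requirements.

```python
# Illustrative sketch: count unique women screened during a hypothetical
# reporting period, using an assumed CSV export of screening records.
import csv
from datetime import date

REPORT_START = date(2023, 7, 1)   # assumed reporting period
REPORT_END = date(2024, 6, 30)

def count_screened(path="screening_records.csv"):
    """Count unique client IDs with a screening date inside the reporting period."""
    screened = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            screen_date = date.fromisoformat(row["screening_date"])  # e.g., 2024-03-15
            if REPORT_START <= screen_date <= REPORT_END:
                screened.add(row["client_id"])
    return len(screened)

if __name__ == "__main__":
    print(f"Women screened this reporting period: {count_screened()}")
```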

  13. Types of Evaluation: Impact Evaluation An impact evaluation is implemented in order to determine what direct effects the program has on those directly and indirectly experiencing it: not only participants, but also program partners and the community. An impact evaluation also provides information about whether the program has been able to meet its short-term goals and objectives. Impact measures often focus on changes in participants’ awareness of breast cancer and the need for screening, attitudes toward preventive screening, and screening behaviors. Sample impact evaluation questions include: • To what extent has your program met its screening targets? • To what extent has your program delivered appropriate and timely screening and diagnostic services? • Have your partnership activities helped to increase screening among priority populations? • To what extent has your program employed evidence-based strategies for recruitment and guidelines for screening of women?
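
The first sample question ("to what extent has your program met its screening targets?") can be answered in its simplest form as a percentage of the workplan target. The sketch below is only an illustration; the single annual target and the numbers are made up for the example.

```python
# Illustrative sketch: express screening performance as a percentage of an
# assumed annual workplan target (numbers are made up for the example).
def percent_of_target(women_screened: int, annual_target: int) -> float:
    """Return women screened as a percentage of the workplan target."""
    if annual_target <= 0:
        raise ValueError("annual_target must be positive")
    return 100.0 * women_screened / annual_target

# Example: 1,840 women screened against a hypothetical target of 2,000.
print(f"{percent_of_target(1840, 2000):.1f}% of the annual screening target reached")
```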

  14. Types of Evaluation: Outcome Evaluation An outcome evaluation is implemented in order to discern whether a program has been able to meet its long-term goals and objectives. Outcome evaluation can track the maintenance of program effects (e.g., screening rates over time). It also documents longer-term effects on morbidity and cancer mortality. Sample outcome evaluation questions include: • Has the number of mammograms and Pap smears provided increased over time? • Has the program maintained enrollment of women in priority populations over time? • Has the program continued to detect breast and cervical cancers at earlier stages?
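
The first sample question asks about change over time. One simple way to look at it is to tally one type of test by calendar year and inspect the trend. The sketch below is only an illustration; the file name, column names, and ISO date format are assumptions.

```python
# Illustrative sketch: tally screening tests of one type by calendar year to
# examine the trend over time (file and column names are assumptions).
import csv
from collections import Counter

def tests_by_year(path="screening_records.csv", test_type="mammogram"):
    """Count records of one test type per calendar year."""
    per_year = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["test_type"] == test_type:
                per_year[row["screening_date"][:4]] += 1  # ISO dates: YYYY-MM-DD
    return dict(sorted(per_year.items()))

if __name__ == "__main__":
    for year, count in tests_by_year().items():
        print(f"{year}: {count} mammograms")
```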

  15. Review of Types of Evaluation

  16. Using Program Outcomes in the Evaluation Process For the BCCEDP, there are major outcome measures for every program component. It is important to keep your program’s outcome measures in mind throughout the evaluation process, including the development stage of your plan, during data collection, and especially when interpreting your findings. Your program activities lead to outcomes that help reach NBCCEDP goals. If you want to assess whether outcomes were met, your evaluation will focus on how they were achieved. From the evaluation results, you may learn that activities need to be revisited, revised, or added to meet program goals and objectives. [Diagram: Evaluation of a Program Component. NBCCEDP Component → Program Activities → Program Outcome Measures → Short-term Goals → Intermediate Goals → Long-term Goals]

  17. Program Outcome Measures The boxes below provide examples of outcomes for each program component. You may have already incorporated these within your evaluation, or you may want to consider how these outcomes could be helpful to you. Outcomes can be process-oriented or health-centered, focusing on priority populations or on knowledge, attitudes, or skills development. Click on a program component below to review examples of outcomes for that component. Click here for a summary of suggested Program Outcome Measures.

  18. Program Management Outcomes • Quality and characteristics of annual workplan. • Correspondence of program’s budget to its workplan. • Allocation of resources to implement program components. • Staff’s comprehension of NBCCEDP components. • Use of program data for program planning and decision-making.

  19. Screening & Diagnostic Services Outcomes • Access to program services for eligible women. • Access to cervical cancer screening, diagnostic and treatment services. • Access to breast cancer screening, diagnostic and treatment services. • Provision of services according to clinical guidelines approved by the grantee medical advisory board or consultants. • Use of current data for effective case management.

  20. Data Management Outcomes • Existence of data systems to collect, edit, manage, and continuously improve tracking of services provided. • Reduction or elimination of program data errors. • Existence of mechanisms for reviewing and assessing data quality.
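
The second and third outcomes above imply routine checks of the data themselves. A minimal sketch of such a check is shown below, assuming a CSV export with a handful of required fields and ISO dates; the field names and file name are illustrative assumptions, not actual NBCCEDP field names.

```python
# Illustrative sketch: flag records with missing required fields or unparseable
# dates. The field names and file name are assumptions for the example.
import csv
from datetime import date

REQUIRED_FIELDS = ["client_id", "screening_date", "test_type", "result"]

def find_data_errors(path="screening_records.csv"):
    """Return (row_number, problem) pairs for records failing basic checks."""
    problems = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f), start=2):  # row 1 is the header
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    problems.append((i, f"missing {field}"))
            screen_date = (row.get("screening_date") or "").strip()
            if screen_date:
                try:
                    date.fromisoformat(screen_date)
                except ValueError:
                    problems.append((i, "invalid screening_date"))
    return problems

if __name__ == "__main__":
    for row_number, problem in find_data_errors():
        print(f"row {row_number}: {problem}")
```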

  21. Quality Assurance & Quality Improvement Outcomes • Providers’ use of current standards, accepted clinical guidelines, and program policies as assessed by program staff. • Existence and maintenance of a continuous quality improvement committee. • Monitoring, assessing, and improving clinical services to meet or exceed CDC performance benchmarks for quality. • Program-eligible women’s satisfaction with program services provided. • Program providers’ satisfaction with the program.

  22. Evaluation Outcomes • Quality of evaluation plans for each program component. • Availability and quality of program data. • Conduct of evaluation activities in order to assess program effectiveness. • Use of evaluation results to inform program decision-making.

  23. Recruitment Outcomes • Implementation of evidence-based strategies to increase recruitment. • Recruitment of program-eligible women in priority populations. • Conduct of program activities to increase awareness about breast and cervical cancer. • Program-eligible women’s attitudes toward screening. • General public’s knowledge about the need for breast and cervical cancer screening.

  24. Partnerships Outcomes • Use of partnerships to recruit and retain providers. • Use of partnerships to educate and increase awareness of breast and cervical cancer. • Use of partnerships to promote and facilitate breast and cervical cancer screening. • Use of partnerships to promote professional development activities. • Engagement of community partners in activities that promote breast and cervical cancer screening services for NBCCEDP priority populations.

  25. Professional Development Outcomes • Development of partnerships with academic and professional organizations in order to build resources for professional development activities. • Local BCCEDP staff’s knowledge about breast and cervical cancer screening, diagnosis, and treatment. • Adoption of evidence-based clinical practices by providers to improve services. • Assessment of provider needs for improving screening, diagnosis, and treatment. • Integration of cultural sensitivity into professional development activities.

  26. More Information on Evaluation If you would like more information about evaluation, you can review the Evaluation Chapter of the NBCCEDP Program Guidance Manual.* Click here to link to the Evaluation Chapter. WORD | PDF *This chapter was made available as of April 2007. Please check with your program consultant as to potential updates to the chapter.

  27. Getting Expert Assistance with Evaluation As you are planning for the evaluation, you should consider whether you need a person with expertise in planning and carrying out the evaluation. Some of the types of people who may be helpful are: • People from cancer registries • Surveillance staff • Data management staff • Program evaluators • Economists, for cost studies You may find them within your own organization or within collaborating organizations. These experts may be particularly helpful to your evaluation team.

  28. CDC Framework for Evaluation In the next section, we will review the CDC Framework for Evaluation. It is a process that you can use to conduct an evaluation project. To learn more about the Framework, click CDC Framework for Evaluation.

  29. CDC Framework for Evaluation The CDC Framework for Program Evaluation in Public Health is the recommended process for conducting evaluations. It outlines six steps for program evaluation. Click here to see a reference handout on the Framework for Evaluation. The following section will use the Framework for Program Evaluation as a guide to the evaluation process for breast and cervical cancer early detection programs. These steps will be described as they relate to BCCEDP evaluation. Modules 2 and 3 will walk you through the process of conducting an evaluation of recruitment and partnership development strategies.

  30. Goals of the CDC Framework The CDC Framework for Evaluation is a process to help you answer these questions: 1. Who is the evaluation for? 2. What program are we evaluating? 3. What methods will we use in conducting our evaluation? 4. How will we gather and analyze information that is credible and in what forms? 5. How will we justify our conclusions? 6. How can we be assured that what we learn will be used?

  31. Ensuring a Quality Evaluation The CDC Framework also addresses the quality of an evaluation. To judge how good an evaluation is, the Framework suggests that you consider the following standards for evaluation: utility, feasibility, propriety, and accuracy.

  32. CDC Framework for Evaluation and Evaluation Case Study As we introduce the CDC Framework for evaluation, you will also be presented with an example for each step. The example will focus on a screening evaluation question. Now, you will be taken through each step of the CDC Evaluation Framework.

  33. The CDC Framework for Evaluation Step 1: Engage Stakeholders The first step in the program evaluation process is to engage stakeholders. Stakeholders are those persons or organizations that have an investment in what will be learned from an evaluation and what will be done with the knowledge. It is important to involve stakeholders in the planning and implementation stages of evaluation to ensure that their perspectives are understood and that the evaluation reflects their areas of interest. Involving stakeholders increases awareness of different perspectives, integrates knowledge of diverse groups, increases the likelihood of utilization of findings, and reduces suspicion and concerns related to evaluation.

  34. The CDC Framework for Evaluation Step 1: Engage Stakeholders Here are different types of BCCEDP stakeholders who are important to engage in all parts of the evaluation. BCCEDP stakeholders can fall into different categories: decision-makers, implementers, program partners, and participants. The table provides examples of each category of stakeholders.

  35. The CDC Framework for Evaluation Step 1: Engage Stakeholders Program Reflection: Identify potential stakeholders who are important to your program. Identify specific people or groups of people who will use the evaluation findings to make decisions about your program. Click here to go to a Program Reflections Word document to work through as you go through this module.

  36. The CDC Framework for Evaluation Step 1: Engage Stakeholders Some of the roles of stakeholders may include: • Serving on an advisory committee for the evaluation. • Prioritizing aspects of the program to evaluate. • Developing questions or surveys. • Providing resources (e.g., American Cancer Society, Centers for Disease Control and Prevention). • Offering data sources (e.g., state cancer registry, National Cancer Institute, Centers for Disease Control and Prevention). • Analyzing data. • Communicating evaluation results.

  37. Engage Stakeholders Case Study You are interested in assessing how your program is doing with its screening activities. For your BCCEDP, the screening activities of interest are ensuring that: 1) women with abnormal screening test results receive timely diagnostic examinations, and 2) women receive a final diagnosis after an abnormal screening result. Stakeholders who may be interested in this question are: • Program director, screening coordinator, case management coordinator, or quality assurance coordinator • Provider networks that screen and provide services for the women • CDC

  38. An Evaluation Workplan An evaluation plan is a program management tool that provides direction and guidance for the overall evaluation as well as each evaluation component. It is designed to be used for evaluation planning, implementation, and monitoring progress. Designing an evaluation plan is intended to make the job of managing your program more efficient and effective. Here is an example of a simple evaluation plan that takes you from your questions to how to use the data. Click here for a Word version of the Evaluation Plan.

  39. Completing the Evaluation Workplan We will be using the Evaluation Workplan to help you prepare for an evaluation through the course of this training. When you are completing the evaluation workplan, you should enter the information related to each step of the CDC Framework for Program Evaluation in the corresponding column. Step 2 (Describe the Program) is not included in the plan because your activities are probably already described in your workplan. You can also put more information about the activities to be evaluated under the Evaluation Question column.

  40. Engage Stakeholders Case Study Now that you have identified stakeholders, here is how your evaluation plan would appear.

  41. The CDC Framework for Evaluation Step 2: Describe the Program The second step in the program evaluation process is to describe the program. Before you can develop an overall evaluation plan, it is important to have a description of your program and its context. Without an agreed-upon program definition and purpose, it will be difficult to focus the evaluation efforts, and the results will be of limited use. Important aspects of the program description are: 1) need for the program, 2) resources, 3) component activities, and 4) expected outcomes. This information is often found in your workplan. Therefore, it is valuable to review the annual workplan with the objectives for each component area.

  42. The CDC Framework for Evaluation Step 2: Describe the Program In order to understand how all of these aspects of the program work together, it is valuable to develop a picture of your program. This can be done in a number of formats, including a flowchart. The program flowchart shows the larger picture of the program, including the relationships between individual activities and the expected results. A program flowchart describes the program’s resources or inputs, activities, products or outputs, and outcomes. It is a graphic representation of all aspects of the program and how they work together to achieve the program’s long-term outcomes. Some people call this picture a logic model.

  43. NBCCEDP Flowchart: Inputs and Activities This flowchart presents the relationships between your program component activities and their potential outcomes. By clicking on each component under Activities, you will be able to view examples that can help you begin to think about how your program corresponds to the flowchart. Click here to review a printable version of the NBCCEDP flowchart. [Flowchart content. Inputs: federal programs and NBCCEDP staff; grantee breast and cervical cancer program; state and community partners; program participants and the public; workplan. Activities: Recruitment; Screening & Diagnostic Services; Professional Development; Quality Assurance; Partnerships; Data Management; Evaluation; Management.]

  44. NBCCEDP Flowchart (continued): Outputs, Short-term Outcomes, Intermediate Outcomes, and Long-term Outcomes [The original slide arranges the following example items in columns under those headings.] • Program services accessible to eligible women. • Access to cervical cancer screening, diagnostic, and treatment services. • Access to breast cancer screening, diagnostic, and treatment services. • Recruitment of program-eligible women in priority populations. • Cervical cancer screening among program-eligible women, with an emphasis on rarely/never screened women. • A reduction in breast and cervical cancer related morbidity and mortality. • Breast cancer screening among program-eligible women, with an emphasis on those aged 50–64. • Evidence-based strategies implemented to increase recruitment. • Services provided according to clinical guidelines approved by the medical advisory committee. • Number of women diagnosed in the program who have an early stage of the disease. • Eligible women re-screened at appropriate intervals. • Timely and adequate service delivery and case management for women with abnormal screening results (or a diagnosis of cancer). • Treatment service delivery for women with cancer. • Case management provided to women with abnormal screening results. • Program-eligible women’s and the public’s awareness, attitudes, and knowledge of screening. • Use of current data for effective case management. • Improvement in rates of breast and cervical cancer rescreening per clinical guidelines. • Providers are knowledgeable about breast and cervical cancer screening. • Evidence-based clinical guidelines adopted by providers to improve services. • Assessment of the needs of providers. • Current standards, guidelines, and policies assessed by the medical advisory consultants. • Management: staff hired; number of staff meetings. • Evaluation: existence of an evaluation workplan. • Breast and cervical cancer early detection issues are addressed by sustained and effective partnerships. • Evidence-based practices used in service delivery. • Use of program data for program planning and decision making. • QA data provided for the clinical practice QA program. • Collaborations are used to recruit providers and conduct professional development activities. • Sustainability and effectiveness of partnerships. • Reduction or elimination of program data errors.

  45. The CDC Framework for Evaluation Step 3: Focus the Evaluation Once the key players are involved and there is a clear understanding of the program, it is important to focus the evaluation design: determine the purpose, users, uses, and evaluation questions, and develop an evaluation plan. Focusing the evaluation design is important because it helps identify useful questions that you want to ask about your program. Click here for a handout of Examples of Evaluation Questions for BCCEDPs.

  46. The CDC Framework for Evaluation Step 3: Focus the Evaluation Given the program’s size, complexity, and budget, it is not possible to evaluate every aspect of the program. You should first examine the DQIGs, which provide valuable monitoring data that help identify problem areas for further investigation, and the program performance noted in your progress reports. Then focus evaluation efforts on the areas of the program that are not working optimally. It is, however, important to look at the program as a whole and think about how you would evaluate each aspect of the program. You should first begin addressing: • Any new initiative with resources allocated to it. • Any activity that consumes a high amount of resources. • Activities that are not successful at meeting their measures of success. • Program inconsistencies, to explore why they exist. • Any unevaluated activity (e.g., recruitment strategies, screening) that is employed frequently by the program.

  47. The CDC Framework for Evaluation Step 3: Focus the Evaluation Program Reflection What are the key evaluation questions that you want answered? Identify: • the stakeholders of the evaluation findings (e.g., program managers, CDC, providers), • what they need to know, and • how they will use that information. Click here to go to a Program Reflections Word document to work through as you go through this module.

  48. Focus the Evaluation Case Study Think about what your outcome measure would be for the provision of timely and appropriate diagnostic services to women receiving abnormal breast or cervical cancer screening results (follow-up). Here are activities that your program does to help track this outcome: • Reviewing weekly all abnormal breast and cervical screening results to determine appropriate referral and follow-up status. • Logging the name and total number of patient records for women with abnormal results. • Assigning case management for every client with an abnormal screening. • Case managing women to ensure appropriate tracking, referral and follow-up.

  49. Focus the Evaluation Case Study Because screening and diagnostic services are a large component of your program, you may want to focus part of your evaluation plan on how to improve them. From your workplan, your goal was to ensure that all of your clients with abnormal screening test results receive timely diagnostic examinations. The measures of effectiveness that you indicated were that: • At least 75% of participating women will receive a final diagnosis within 60 days after an abnormal Pap smear. • At least 75% of participating women will receive a final diagnosis within 60 days after an abnormal mammography result. Let’s say you are interested in whether you are meeting the screening targets that were set in your workplan. You ask the question: To what extent has your program met NBCCEDP standards for timeliness of follow-up? You would then need to determine how and what information needs to be collected to answer this question. We will cover this in the next section.
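
As an illustration of how this question might be answered from program records, the sketch below computes the percentage of abnormal results that reached a final diagnosis within 60 days. It is only a sketch: the file name, column names, and ISO date format are assumptions, and records without a final diagnosis are simply counted as not yet timely. The 60-day window and 75% benchmark come from the case-study workplan above.

```python
# Illustrative sketch: percent of abnormal screening results with a final
# diagnosis within 60 days (file and column names are assumptions).
import csv
from datetime import date

def timely_follow_up_rate(path="abnormal_results.csv", max_days=60):
    """Return the percent of abnormal results followed by a diagnosis within max_days."""
    timely = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            diagnosis_date = (row.get("final_diagnosis_date") or "").strip()
            if not diagnosis_date:
                continue  # no final diagnosis yet: counted as not timely here
            abnormal = date.fromisoformat(row["abnormal_result_date"])
            diagnosis = date.fromisoformat(diagnosis_date)
            if (diagnosis - abnormal).days <= max_days:
                timely += 1
    return 100.0 * timely / total if total else 0.0

if __name__ == "__main__":
    print(f"{timely_follow_up_rate():.1f}% received a final diagnosis within 60 days "
          f"(workplan measure: at least 75%)")
```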
