
Implementing Your Evaluation Plan: So You Have a Plan... Now What?



Presentation Transcript


    1. Implementing Your Evaluation Plan: So You Have a Plan... Now What? Rachel Barron-Simpson, MPH, CDC/NCCDPHP Division for Heart Disease and Stroke Prevention. WISEWOMAN Annual Meeting, September 14, 2009

    2. Evaluation Purpose: continuous quality improvement of WISEWOMAN program services; ensuring quality care for clients. Funded programs should use their program evaluation to identify areas for continuous quality improvement of program services and to ensure quality care for participants. The WISEWOMAN Program Guidance and Resource Document contains detailed information about WISEWOMAN funded program evaluation requirements. The funded program evaluation requirements that are specified in this evaluation overview document are purposely broad and are intended to be flexible. CDC understands that not all funded programs are alike and that each program may need to implement slightly different evaluation activities depending on its specific evaluation needs. The purpose of funded program evaluation is to improve the program; therefore, programs should consider their evaluation needs in relation to the program requirements.

    3. WISEWOMAN Evaluation Plan Template: Reiterate that this is the template for their evaluation plans. Explain that the template is based on the CDC evaluation framework. Point out that the template evolved from the WISEWOMAN program evaluation overview document. Stress the importance of revisiting the evaluation design and plan before beginning the evaluation.

    4. Revisiting Your Evaluation Plan: Go through the steps of the CDC evaluation framework. Explain that they are the basis for the template from which they have developed their evaluation plans. The CDC framework for program evaluation, which has been around for 10 years, guides our evaluation efforts here at CDC. We encourage our funded programs to use the framework because it provides a systematic approach to evaluation that is integrated with routine program operations. The emphasis is on practical, ongoing evaluation strategies that involve program stakeholders. The framework includes six steps, starting at the top of the circle: engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned. In this presentation, we will walk through the steps and give you some practical tips for implementing them in your evaluations. You will also notice that the framework includes four standards. People often overlook them, but they should guide your decision making as you employ the six steps. Utility standards ensure that the information needs of evaluation users are satisfied. Feasibility standards ensure that the evaluation is viable and pragmatic (executable). Propriety standards ensure that the evaluation is ethical (i.e., conducted with regard for the rights and interests of those involved and affected). Accuracy standards ensure that the evaluation produces findings that are considered correct. Implementing these standards is an active process; it requires you to consider them at each step in the evaluation.

    5. Review Evaluation Stakeholders: Are the stakeholders you have identified still the key users for your findings? Keep your list of stakeholders focused. Should any stakeholders be added, removed, or re-prioritized? UTILITY STANDARD: Persons involved in or affected by the evaluation should be identified, so that their needs can be addressed. Make use of those individuals who would benefit from this evaluation. Engaging stakeholders involves including the sponsors as part of the group of primary intended users. Ask whether the stakeholders they have identified are still current. Ask them to consider who will ultimately use the evaluation findings. Explain the importance of focusing and prioritizing the list of stakeholders based on eventual use of the evaluation; a long list of stakeholders can lay the groundwork for an unfocused or unwieldy evaluation. The emphasis should be on who needs the information and who will use the findings.

    6. Review Program Description: Does the description adequately orient the evaluator to your program? Are the activity descriptions and narrative still accurate and current? ACCURACY STANDARD: The program being evaluated should be described and documented clearly and accurately, so that the program is clearly identified. The context in which the program exists should be examined in enough detail, so that its likely influences on the program can be identified. Assure that the description is concise and accurate. Make certain that the information is still current. The description should adequately describe the program so that someone who has no experience of it can understand how it works.

    7. Review Evaluation Design (Purpose of the Evaluation): Has the purpose of your evaluation changed since you designed your evaluation? Do you still intend to use the findings in the same way you had originally planned? UTILITY STANDARD: The persons conducting the evaluation should be both trustworthy and competent to perform the evaluation, so that the evaluation findings achieve maximum credibility and acceptance. FEASIBILITY STANDARD: The evaluation procedures should be practical, to keep disruption to a minimum while needed information is obtained. PROPRIETY STANDARD: Evaluation should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants. ACCURACY STANDARD: The purposes and procedures of the evaluation should be monitored and described in enough detail, so that they can be identified and assessed. There are several possible purposes for evaluation; their purpose is likely to have to do with program improvement. Their evaluation might also have the purpose of showing the funder what they are doing. Are their plans for the evaluation findings (who will use the findings, for what purpose) still the same as when they designed the evaluation?

    8. Review Evaluation Design (Evaluation Questions): Are these still the questions you wish to ask? Are there questions you wish to add or remove? Are the questions (and the number of questions) realistic? Stress the importance of asking the right questions. No one has the resources to ask all the questions they want. Questions should be realistic and answerable. It may be necessary to scale back due to limited resources and time.

    9. Review Data Collection Plan: UTILITY STANDARD: Information collected should be broadly selected to address pertinent questions about the program and be responsive to the needs and interests of clients and other specified stakeholders. PROPRIETY STANDARD: Evaluators should respect human dignity and worth in their interactions with other persons associated with an evaluation, so that participants are not threatened or harmed. ACCURACY STANDARD: The information collected, processed, and reported in an evaluation should be systematically reviewed, and any errors found should be corrected. Ask whether the data source(s) is (are) still available and accessible. Determine how many people and how much time will be necessary for collecting the data. If a survey instrument or interview protocol will be needed, make a plan for drafting one. Keep surveys and interviews to a manageable length.

    10. Review Data Analysis Plan: How will you record and store the data? Will you need to code the data? What is your protocol? Will you use a spreadsheet? A database? Special software? UTILITY STANDARD: The perspectives, procedures, and rationale used to interpret the findings should be carefully described, so that the bases for value judgments are clear. ACCURACY STANDARD: Quantitative and qualitative information in an evaluation should be appropriately and systematically analyzed so that evaluation questions are effectively answered. Reiterate that data analysis is about treating the data; this is different from gathering it. Will they need to develop a spreadsheet, a log, a simple database, or other means of compiling the data? What is their analysis plan? Does it involve coding the data? Will there be any mathematical analysis, such as sums or percentages? How will they ensure data entry and analysis accuracy?
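The kind of simple mathematical analysis described above (tallying coded responses and turning them into percentages) can be sketched in a few lines; the response codes and counts below are invented for illustration, and a real program might do the same thing in a spreadsheet or database instead.

```python
# Sketch: tallying coded survey responses and computing percentages.
# The codes and data here are hypothetical examples, not WISEWOMAN data.
from collections import Counter

# Coded responses to one evaluation question:
responses = ["satisfied", "satisfied", "neutral", "dissatisfied", "satisfied"]

counts = Counter(responses)
total = len(responses)

for code, n in counts.most_common():
    print(f"{code}: {n} of {total} ({100 * n / total:.0f}%)")
```

Checking a hand count of a few records against the computed tallies is one simple way to verify data entry and analysis accuracy, as the slide asks.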

    11. Consider a Timeline: Consider using a Gantt chart or other means of showing the timeline for your evaluation and who is responsible for what. This is something to consider; though it is not required, and you already have the timing column and staffing plan, a Gantt chart or similar tool could be helpful for managing the evaluation. It would indicate a timeline for data collection, analysis, and reporting, as well as who is responsible for each element of the evaluation.
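A Gantt-style view does not require special software; as a rough sketch (the task names, staff roles, and month ranges below are hypothetical), even a plain-text chart can show who does what and when:

```python
# Sketch: a minimal text Gantt chart for evaluation tasks.
# Tasks, owners, and month ranges are invented examples.
tasks = [
    ("Data collection", "Program staff", 1, 4),
    ("Data analysis",   "Evaluator",     4, 6),
    ("Reporting",       "Evaluator",     6, 8),
]

months = 8  # planning horizon in months
for name, owner, start, end in tasks:
    # '#' marks months the task is active, '.' marks months it is not.
    bar = "".join("#" if start <= m <= end else "." for m in range(1, months + 1))
    print(f"{name:<16} {bar}  ({owner})")
```

Each row pairs a timeline with a responsible person, which is exactly the two pieces of information the slide suggests the chart should carry.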

    12. Review Communication Plan: What will your report look like? Who will receive the findings? Create a short (2-page) summary piece. UTILITY STANDARD: Evaluation reports should clearly describe the program being evaluated, including its context, and the purposes, procedures, and findings of the evaluation, so that essential information is provided and easily understood. Significant interim findings and evaluation reports should be disseminated to intended users, so that they can be used in a timely fashion. ACCURACY STANDARD: The conclusions reached in an evaluation should be explicitly justified, so that stakeholders can assess them. What will the report look like, and who will prepare it? How long will it be? Who will receive the report, and is the format appropriate to the recipient? Be sure to create a short (2-page) summary that captures the essential program details and the findings of the evaluation.

    13. Questions
