Learn how to evaluate the successful implementation of a new program in a school setting. Understand the importance of ongoing monitoring, technical assistance, and teacher follow-up to ensure program effectiveness. Explore evaluation strategies such as observations, interviews, and implementation checklists.
Were the magic bullets used? • How do you know when teachers put a new program into action? • What kind of follow-up is needed to ensure that a staff development workshop actually makes it to the classroom?
Were the magic bullets used? • Ongoing Monitoring • Ongoing Technical Assistance • Mentoring / Modeling / Model Classrooms • Resources / Materials • Incorporate the new program into administrative expectations / supervision / performance evaluation
CIPPI Model • Context • Input • Process • Product • Impact • The goal is to make INFORMED judgments about Program merit or value.
The Nature of Evaluation Tasks • The goal is to make INFORMED judgments about Program merit or value. • Making judgments requires a thorough understanding of the implementation of the treatment. • What was delivered to the target audience?
Process Evaluation • The main function of a process evaluation, or the process evaluation phase of a comprehensive evaluation, is to monitor the implementation of the Program. • You can’t make judgments of program merit before you know if the program reached the target audience.
Evaluation Strategies • Observations • Interviews • Implementation Checklists • Contact / Activity logs • Records Review
The Implementation Plan • Has the plan for Program implementation, reported in the Input evaluation phase, been put into action? • Has the Program made any changes to the original implementation plan?
The Implementation Plan • Has the implementation plan been thoroughly communicated to the frontline service delivery staff? • Have the frontline service delivery staff been trained thoroughly enough to achieve complete implementation?
Stakeholder Expectations • What do the stakeholders (especially funding agencies) expect to happen? • What level of accountability for Program implementation do they expect?
Implementation Standards • Do Program implementation standards exist? • Is there an Implementation Fidelity Checklist available? If not, one may need to be developed with the input of stakeholders and Program personnel.
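If a fidelity checklist has to be developed, one way to operationalize it is as a short, scored instrument. The sketch below is a hypothetical Python example only; the item wording, the 0-2 rating scale, and the scoring rule are assumptions, not a published standard.

```python
# Hypothetical Implementation Fidelity Checklist: an observer rates each item
# 0 (not observed), 1 (partially implemented), or 2 (fully implemented).
FIDELITY_ITEMS = [
    "Lesson follows the program's prescribed sequence",
    "Required program materials are available and in use",
    "Teacher uses the modeled instructional strategies",
    "Students complete the program's practice activities",
]

def score_checklist(ratings):
    """Return the percent of possible fidelity points earned (0-100)."""
    if len(ratings) != len(FIDELITY_ITEMS):
        raise ValueError("one rating per checklist item is required")
    earned = sum(ratings)
    possible = 2 * len(FIDELITY_ITEMS)
    return 100 * earned / possible

# Example observation of one classroom.
classroom_ratings = [2, 2, 1, 0]
print(f"Fidelity score: {score_checklist(classroom_ratings):.0f}%")
```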
Process Evaluation Plan • Create a plan for documenting the implementation of the Program. • Attendance data, activity logs, contact records, descriptions of the activities or interactions, descriptions of the content of the services delivered, materials disseminated, etc. • Make a timeline for who collects what and when.
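One minimal way to capture the "who collects what and when" record keeping is a structured activity log. The sketch below shows one hypothetical record layout written to a CSV file; the field names, dates, and values are illustrative assumptions, not a required format.

```python
import csv
from datetime import date

# Hypothetical activity-log layout: one row per program contact or session.
FIELDNAMES = ["date", "collector", "activity", "attendance", "materials", "notes"]

log_rows = [
    {"date": date(2024, 9, 12).isoformat(), "collector": "site coordinator",
     "activity": "small-group reading session", "attendance": 6,
     "materials": "program workbook, unit 2", "notes": "two students absent"},
]

with open("activity_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerows(log_rows)
```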
Dose-Response • It may be helpful to develop an Intensity of Intervention Scale. Such scales can be used to examine a “Dose-Response” relationship. • Assist the Program in developing an evaluation information system that will support the steps outlined in the plan.
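A dose-response analysis can be as simple as correlating each participant's intensity score with an outcome measure. The sketch below uses a Pearson correlation on made-up data; the scale values and outcome gains are purely illustrative assumptions.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical data: intensity-of-intervention scores (dose) and
# outcome gains (response) for the same participants.
dose = [1, 2, 2, 3, 4, 4, 5, 5]                      # e.g., sessions attended per week
response = [2.0, 3.5, 3.0, 5.0, 6.5, 6.0, 8.0, 7.5]  # e.g., test score gain

r = correlation(dose, response)
print(f"Dose-response correlation: r = {r:.2f}")
```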
Traveling Observer Technique • There is no substitute for site visits with a structured agenda, including observations of specific program activities and facilities, and requests for data. • Provide each Observer with a protocol, including checklists and schedules.
Formative Conclusions • Are there any problems with frontline staff compliance with the implementation plan? Do staff activities match the plan? • Are the participants, or target populations, cooperating with the implementation of the Program?
Formative Conclusions • Do the Implementation Fidelity Checklist data show any problems with execution of the plan? • Is the implementation on schedule?
Summative Conclusions • What dosage level (amount of the treatment or Program) was actually received by the participants? • Are there any sub-group differences in dosage level?
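Checking for sub-group differences in dosage can start with a simple summary of dosage within each group. The sketch below computes mean dosage per sub-group from hypothetical records; the group labels and dosage values are assumptions for illustration only.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical participant records: (sub-group label, dosage received).
records = [
    ("School A", 12), ("School A", 10), ("School A", 11),
    ("School B", 6),  ("School B", 7),  ("School B", 5),
]

# Group dosages by sub-group, then summarize each group.
by_group = defaultdict(list)
for group, dosage in records:
    by_group[group].append(dosage)

for group, dosages in sorted(by_group.items()):
    print(f"{group}: mean dosage = {mean(dosages):.1f} sessions (n = {len(dosages)})")
```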
Summative Conclusions • Summarize your conclusions about the implementation of the program. • Based on all of the information you have gained from the Process Evaluation Phase, was the treatment delivered? • Will the summative evaluation evidence be a fair test of Program merit or value?