
Program Evaluation Strategies to Improve Teaching for Learning


Presentation Transcript


  1. Program Evaluation Strategies to Improve Teaching for Learning. Rossi Ray-Taylor and Nora Martin, Ray.Taylor and Associates. MDE/NCA Spring School Improvement Conference 2008

  2. Introductions. Overview for the session.

  3. Evaluation comfort levels

  4. Research-driven school design: rigor, relevance, relationships

  5. Start and end with data

  6. Evaluation can provide information about • Definition and evidence of the problem (needs assessment) • Contextual information about the factors related to the problem and solutions • The input factors and resources • The interventions, processes and strategies employed • Outcomes, results and effectiveness

  7. Program evaluation can be a form of action research

  8. Elements of a sound evaluation • Clear statement of what the project is intended to do and why • Needs assessment • Theory of action • Clear, measurable statement of goal attainment • Appropriate evaluation methods and tools • Transparency

  9. Evaluation can lay the groundwork for accountability

  10. Key concepts • Identify the total system impact of program and financial decisions • Institutions led by data and evidence of results

  11. Types of Data • Achievement • Demographic • Program • Perception • Process • Costs

  12. Qualities of Data & Information • Time series (repeated data collection): how does the effect change over time? • Cohort analysis: overall effect on one group over time • Benchmarking & standards: makes the data relative, establishes context and comparables

  13. Triangulation: looking for corroborating information across multiple indicators • Pattern & trend analysis • Leading & lagging indicators
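The data qualities above lend themselves to simple scripted checks. Below is a minimal sketch, not part of the presentation, of how a school team might look at a time series, a cohort, and a benchmark using Python and pandas. The column names, values, and the state benchmark figure are all hypothetical; triangulation would amount to repeating the same look across several indicators (for example, assessment scores, attendance, and perception surveys).

```python
# Minimal sketch (hypothetical data and column names), illustrating the
# data qualities named on slides 12-13: time series, cohort analysis,
# and benchmarking against an external standard.
import pandas as pd

# Hypothetical repeated-measures achievement data for one school.
scores = pd.DataFrame({
    "year":           [2005, 2006, 2007, 2005, 2006, 2007],
    "cohort":         ["class_of_2010"] * 3 + ["class_of_2011"] * 3,
    "pct_proficient": [61.0, 64.5, 68.2, 58.3, 60.1, 65.7],
})

# Time series: how does the effect change over time, across all students?
trend = scores.groupby("year")["pct_proficient"].mean()

# Cohort analysis: the overall effect on one group followed over time.
cohort_trend = scores.groupby(["cohort", "year"])["pct_proficient"].mean()

# Benchmarking & standards: compare against an external standard
# (hypothetical value) to make the data relative and establish context.
STATE_BENCHMARK = 65.0
scores["above_benchmark"] = scores["pct_proficient"] >= STATE_BENCHMARK

print(trend)
print(cohort_trend)
print(scores[["year", "cohort", "above_benchmark"]])
```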

  14. Practical matters • Accuracy • Reliability • Validity • Accessibility

  15. Evaluation. Employ evaluation strategies from the very beginning of a project, and assemble and review effectiveness data. State measurable outcomes, document processes, and review progress throughout the life of the project.

  16. “Where outcomes are evaluated without knowledge of implementation, the results seldom provide a direction for action because the decision maker lacks information about what produced the outcomes (or lack of outcomes).” Michael Quinn Patton, quoted in “Data Analysis for Comprehensive Schoolwide Improvement”, Victoria L. Bernhardt, 1998

  17. Central Evaluation Questions • Did the program/project do what was intended? • Did the project stick to the plan? There may be valid reasons for varying from the plan; if so, what are they?

  18. What is the theory of change for the project? • Why is this project being carried out this way and why is it judged to be the most appropriate way? • These questions are important because they help when judging the impact of changes in the plan.

  19. What is the context for the project? • Issues include past trends, local politics, resource distribution, threats, opportunities, strengths and weaknesses

  20. What actually happened during the course of the project? • Who was served by the project? Why? • Who was not served by the project? Why? • Was the project valued by the intended audience?

  21. What were the inputs and resources, both tangible (e.g., financial and material) and intellectual? • What was the “cost” of the project?

  22. What were the results /outcomes? • Were goals and objectives met? • What were the intended and unintended consequences? • What was the impact on the overall system? • Was there a process impact – did the project result in a change in the way that business is done?

  23. A successful project begins and ends with a good evaluation design

  24. How do these questions improve teaching for learning?

  25. The best and most “sticky”, lasting interventions have the following components • They are based in research and evidence • They are locally constructed and customized

  26. They are targeted to a clear view of the problem to be solved • They are built for sustainability • They are “owned” – not just a matter of compliance • They are designed to build local capacity

  27. Understand change and apply research about change to improve teaching for learning • Change takes trust • Change takes building relationships • Change takes endurance (time) • Change takes knowledge of research

  28. Examine policies and practices that serve as barriers and those that serve as catalysts to achievement

  29. Evaluate • Audit the system • Measure results • Measure change toward the goal, not just awareness or implementation

  30. Meta-evaluation. Use meta-evaluation strategies to look for results across projects and interventions.

  31. Evaluation readiness • Identify and clearly state program goals and outcomes • Transform these goals into measurable objectives • Define program theory and supporting research

  32. Develop the formative and summative evaluation plan • Develop the plan to gather data, deploy evaluation resources, gather information from stakeholders, and report findings back to them

  33. Design for continuous feedback and transparency

  34. Consider system evaluation policies and expectations • At the time of proposal, initiatives, programs, and projects should be designed to include program evaluation

  35. “It’s easy to make judgments – that’s evaluation. It’s easy to ask questions about impact – that’s evaluation. It’s easy to disseminate reports – that’s evaluation. What’s hard is to put all those pieces together in a meaningful whole which tells people something they want to know and can use about a matter of importance. That’s evaluation.” Halcolm, quoted in “Data Analysis for Comprehensive Schoolwide Improvement”, Victoria L. Bernhardt, 1998

  36. Program evaluation planning document
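The presentation does not reproduce the planning document itself, so the sketch below is only an illustration of the fields such a document could capture, drawn from the elements named on slides 8, 31, and 32 (problem statement, theory of action, measurable objectives, data sources, formative and summative checkpoints, stakeholder reporting). Every field name and example value is hypothetical.

```python
# Illustrative sketch of a program evaluation planning record; field names
# and example values are hypothetical, not the presenters' template.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationPlan:
    program_name: str
    problem_statement: str            # definition and evidence of the problem (needs assessment)
    theory_of_action: str             # why this approach is expected to work
    measurable_objectives: List[str]  # goals restated as measurable outcomes
    data_sources: List[str]           # achievement, demographic, perception, process, cost data
    formative_checkpoints: List[str]  # milestones for reviewing progress during the project
    summative_measures: List[str]     # end-of-project outcome measures
    stakeholder_reports: List[str] = field(default_factory=list)  # who receives findings, and when

# Hypothetical example of a completed plan.
plan = EvaluationPlan(
    program_name="Grade 9 literacy intervention",
    problem_statement="38% of entering 9th graders read below grade level",
    theory_of_action="Daily targeted reading instruction improves comprehension",
    measurable_objectives=["Raise reading proficiency from 62% to 72% within two years"],
    data_sources=["state assessment", "course grades", "student perception survey"],
    formative_checkpoints=["quarterly data review with teaching teams"],
    summative_measures=["spring state assessment compared to the baseline cohort"],
    stakeholder_reports=["board report each semester"],
)
```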

  37. “In the beginning you think. In the end you act. In between you negotiate the possibilities. Some people move from complexity to simplicity and on into catastrophe. Others move from simplicity to complexity and onward into full scale confusion. Simplification makes action possible in the face of overwhelming complexity. It also increases the odds of being wrong. The trick is to let a sense of simplicity inform our thinking, a sense of complexity inform our actions, and a sense of humility inform our judgments…” Michael Quinn Patton (p. 143 in Bernhardt), quoted in “Data Analysis for Comprehensive Schoolwide Improvement”, Victoria L. Bernhardt, 1998

  38. Resources • Center for Evaluation & Education Policy, www.ceep.indiana.edu • The Evaluation Center, Western Michigan University

  39. More Resources • What Works in Schools: Translating Research into Action, by Robert J. Marzano • Data Analysis for Comprehensive Schoolwide Improvement, by Victoria L. Bernhardt

  40. More Resources • The “Data Wise” Improvement Process: Eight steps for using test data to improve teaching and learning, by Kathryn Parker Boudett et al., Harvard Education Letter, January/February 2006, Volume 22, Number 1.

  41. Rossi Ray-Taylor, PhD, rossi@raytaylorandassoc.org • Nora Martin, PhD, nora@raytaylorandassoc.org • Ray.Taylor and Associates, 2160 S. Huron Parkway, Suite 3, Ann Arbor, Michigan 48104 • www.raytaylorandassoc.org
