
Evaluation Presentation



Presentation Transcript


  1. Evaluation Presentation How does evaluation affect your program? L&D Associates

  2. Contents
     • Evaluation Fact
     • What is evaluation?
     • Why evaluate?
     • A menu of options
     • Blended approaches for maximizing success, learning, and accountability
     • Conclusions

  3. Evaluation Fact
     • Many millions of dollars are spent each year in the non-profit sector on programs designed to improve social conditions
     • BUT few of these programs are evaluated to find out whether those dollars made any difference in the community!

  4. What is evaluation?
     • Evaluation determines quality or value:
        • How well is it working?
        • How can it be improved?
        • Is it the best intervention/strategy to address this need?
     • Trained evaluators have many useful skills:
        • Needs assessment/problem diagnosis
        • Designing evaluation systems
        • Conducting evaluations
        • Teaching staff evaluation skills

  5. Why Evaluate?
     • Quality improvement
        • Streamlining a new program
        • Making a mature program better
     • Demonstrating value
        • For accountability
        • To justify budget requests
     • Learning
        • Experimenting with new approaches
        • Understanding why a previous approach worked or didn’t work

  6. Evaluation Options
     • Self-evaluation/quality management
        • Staff evaluate their own program
     • Facilitated self-evaluation
        • An evaluation consultant/coach helps program staff evaluate their own program
     • External evaluation
        • An independent evaluator (from outside the organization) reviews the program

  7. Self-evaluation
     • Managerial responsibility for quality
        • Every manager is responsible for the quality or effectiveness of his/her own program or department
        • Evaluation is an integral part of managing a program
     • Leadership: helping staff think about quality
        • Evaluation helps staff think about the purpose of their work, and what it means to create value
     • Innovation and experimentation
        • Quality improvement = experimenting with new and innovative ideas/methods for adding value
        • Evaluation = finding out which of those ideas/methods worked best, and should be implemented more widely

  8. Facilitated self-evaluation
     • To enhance evaluation knowledge and expertise within the program
        • Facilitated self-evaluation provides an excellent opportunity for “learning by doing”
     • To set up a good self-evaluation system
        • A facilitated self-evaluation can be used to set up a system that staff can later use without outside expertise
     • As an organizational change intervention
        • Participation in a facilitated self-evaluation process can help change organizational culture (thinking and behavior)

  9. External evaluation
     • A fresh set of eyes
        • What are our self-evaluation processes missing?
        • Especially useful for finding unexpected results/ripple effects, and/or new ways of thinking about the program
     • A source of new ideas
        • External evaluation consultants have often seen many programs, and can bring “best practice” ideas from what they have seen elsewhere
     • Independence
        • Someone with no vested interest in the program is less biased toward looking for a particular result
        • Important for accountability (objectivity)

  10. Different Approaches at Different Stages
     • Program Planning (needs assessment; baseline data; evaluation design)
        • Bring in an evaluation expert to help with planning and baseline data collection
     • Program Implementation (fine-tuning/streamlining; experimenting with different methods; asking more evaluation questions)
        • Facilitated self-evaluation (to build evaluation skills); then ongoing self-evaluation
     • Program Maturity (full, formal evaluation; comparisons with other programs; learning for future program planning)
        • External evaluation: ideas for improvement plus accountability

  11. Phasing in External Evaluation
     • Advice, support, and skill building: an external evaluation specialist trains and assists internal evaluators
     • External meta-evaluation: the external evaluation specialist reviews internal evaluation reports and makes suggestions for improvement
     • Combination internal-external evaluation: the external evaluator supplements internal evaluation findings with independent investigation of key areas
     • Fully independent external evaluation: the external evaluator conducts a fully independent review of the program

  12. Building Interest in Evaluation
     • More Useful Feedback on Program Effectiveness
     • Seeking Out New (External) Perspectives
     • Improved Program Performance
     • Enthusiasm About Further Improvement
     • Confidence in Program Quality

  13. Concluding Comments
     • Use a mix of approaches
        • Different approaches are useful at different stages
        • Together they are a powerful combination for enhancing success, learning, and accountability
     • Build evaluation capacity gradually
        • Build internal evaluation skills that improve program performance and increase confidence in quality
        • Develop new perspectives and ideas through valuing the ‘external eye’
        • 100% positive feedback = not enough innovation!

  14. Acknowledgements L&D Associates thanks Dr Jane Davidson for providing the basis for this presentation. E. Jane Davidson, Ph.D. The Evaluation Center, Western Michigan University http://homepages.wmich.edu/~jdavidso
