





  1. Evaluation is NOT a Dirty Word Kathleen Dowell, Ph.D. EvalSolutions Epilepsy Foundation: Best Practices Institute September 29, 2012 Denver, Colorado

  2. Too expensive

  3. Too complicated

  4. Too time consuming

  5. Not a priority

  6. Just don’t know where to start

  7. Barriers • Lack of research/statistics skills • Lack of time • Lack of resources • Other priorities • Lack of incentive • Fear • Don’t see value

  8. What is Evaluation? The process of determining the merit, worth, or value of a program (Scriven, 1991)

  9. What is Evaluation? Systematic inquiry that describes and explains policies’ and programs’ operations, effects, justifications, and social implications (Mark, Henry, & Julnes, 2000)

  10. What is Evaluation? The systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs (Rossi & Freeman, 1989)

  11. In simpler terms… Collection of information to determine the value of a program: eVALUation

  12. Evaluation is NOT… • Auditing • Personnel assessment • Monitoring (although this can be part of an evaluation process) • Used to end or shut down programs

  13. Evaluation Myth #1 Evaluation is an extraneous activity that generates lots of boring data with useless conclusions

  14. Evaluation Myth #2 Evaluation is about proving the success or failure of a program

  15. Evaluation Myth #3 Evaluation is a unique and complex process that occurs at a certain time in a certain way, and almost always includes the use of outside experts.

  16–22. How Can Evaluation Help You? • Demonstrate program effectiveness or impacts • Better manage limited resources • Document program accomplishments • Justify current program funding • Support need for increased funding • Satisfy ethical responsibility to clients to demonstrate positive and negative effects of participation • Document program development and activities to help ensure successful replication

  23. Ultimately… To improve program performance, which leads to better value for your resources

  24–29. No Evaluation Means… • No evidence that your program is working or how it works • Lack of justification for new or increased funding • No marketing power for potential clients • Lack of credibility • Lack of political and/or social support • No way to know how to improve

  30. Program Life Cycle

  31–36. Basic Terminology • Types of Evaluation • Outcome (summative) • Process (formative) • Outcomes • Indicators • Measures • Benchmarks • Quantitative vs. qualitative

  37. Evaluation Process

  38. Engage Stakeholders • Those involved in program design, delivery, and/or funding • Those served by the program • Users of the evaluation results

  39. Clearly Define Program • Resources, activities, outcomes • Context in which program operates • Logic model • Explicit connections between “how” and “what” • Helps with program improvement • Good for sharing program idea with others • Living, breathing model

  40–42. IF I take an aspirin, THEN my headache will go away

  43. IF = Inputs & Activities THEN = Outcomes

  44. Written Evaluation Plan • Outcomes • Indicators • Tools • Timelines • Person(s) responsible (optional)

  45. Sample Evaluation Plan

  46. Credible Data Collection Tools • Valid and reliable tools • Valid = measures what it is intended to measure • Reliable = consistent results over time • Qualitative • Quantitative • Will answer your evaluation questions and inform decision-making

  47. Collect Credible/Useful Data • Quantitative • Surveys • Tests • Skill assessments • Qualitative • Focus groups • Interviews • Journals • Observations

  48. Analyze Data • Many methods • Answer evaluation questions • Engage stakeholders in interpretations • Justify conclusions and recommendations • Get help if needed!

  49. Share/Use Results • Reporting format • Getting results into the right hands • Framing the results • Collaborative vs. confrontational approach • Keeping users “in the loop” • Debriefs and follow-up
