
Evaluation for VISTAs: Using sound evaluation practices to support your project’s success


Presentation Transcript


  1. Evaluation for VISTAs: Using sound evaluation practices to support your project’s success. By Sara McGarraugh, Research Analyst, Improve Group

  2. Let’s get to know each other

  3. Overview • Purpose and benefits of evaluation • Evaluation tools that you can use • Designing an evaluation

  4. What is evaluation? “In reality, (nonprofit) performance is all about translating caring, believing, and compassion into results.” (Letts, Ryan & Grossman, High Performance Nonprofit Organizations, 1999)

  5. What is evaluation? “Assessing strengths and weaknesses… to improve effectiveness”

  6. Why Evaluation?

  7. Demonstrate results *This dashboard was created with fake (sample) data

  8. Reflect

  9. Create a learning community

  10. Influence others

  11. Engage stakeholders

  12. Plan for sustainability

  13. What are the benefits of evaluation? • Knowledge and reflection • Information to share with stakeholders • Good evaluation practices are also good program management practices

  14. Barriers to Evaluation • Humans resist change • We want to get along • Evaluation might suggest we are doing something wrong or need to change • Time and cost

  15. The Role of Evaluation Formative evaluation: How is the program delivered and is it meeting expectations?

  16. The Role of Evaluation Summative evaluation: Did the program meet its goals?

  17. The evaluation process

  18. How are project and evaluation phases similar?

  19. Putting evaluation into practice

  20. Start up: Define Purpose • Who are the key stakeholders of the evaluation? • What do you hope to learn as part of the evaluation? • How will you prioritize competing interests?

  21. Design & Planning: Refine goals • What are your program’s aspirations (your goals)? • Goals should be SMART: Specific, Measurable, Achievable, Relevant, and Time-bound

  22. Design & Planning: Define project outcomes What is the CHANGE you wish to see?

  23. Practice: refining goals and outcomes

  24. Logic models: a basic evaluation tool A Logic Model can describe how your program’s goals and activities lead to results and how to measure them

  25. How are they useful? • Help get everyone on the same page • Encourage investment and buy-in • Facilitate organization • Provide clear and concrete guidelines • Serve as a roadmap throughout evaluation process • Useful for grant proposals

  26. What does it show? GOALS → ACTIVITIES → OUTPUTS → INTENDED OUTCOMES → MEASURES

  27. Definitions • Activities: what your program does • Outputs: countable products showing an activity occurred (evidence) • Outcomes: benefits received from your program (changes in knowledge, behavior, or condition)

  28. Example • GOAL: For all students to be at or above their grade reading level • ACTIVITIES: Tutoring, theater, book club • OUTPUTS: # of tutoring sessions; # of books/articles read • INTENDED OUTCOMES: Students are more confident in reading; students think reading is fun • MEASURES: Survey, check-in form
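  It can help to think of a logic model as structured data. Below is a minimal sketch in Python that encodes the reading-program example above; the class and field names are illustrative choices, not part of any standard evaluation tool.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    # One row of a logic model: goal -> activities -> outputs -> outcomes -> measures
    goal: str
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    intended_outcomes: list[str] = field(default_factory=list)
    measures: list[str] = field(default_factory=list)

# The reading-program example from the slide, encoded as data
reading_program = LogicModel(
    goal="All students at or above their grade reading level",
    activities=["Tutoring", "Theater", "Book club"],
    outputs=["# of tutoring sessions", "# of books/articles read"],
    intended_outcomes=[
        "Students are more confident in reading",
        "Students think reading is fun",
    ],
    measures=["Survey", "Check-in form"],
)
```

  Writing the model down this explicitly makes it easy to check that every activity produces a countable output and every intended outcome has a measure attached.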

  29. An alternative to a logic model

  30. Practice: Logic model development

  31. Break

  32. Is it a tool, instrument, or protocol? Designing data collection tools

  33. Design measures and tools • What type of evidence will demonstrate outcomes? • Design evaluation tools

  34. Traditional data-gathering strategies

  35. Consider creative ways to gather data

  36. Matching evaluation tools to your work • Use available resources • Use knowledge of staff • Honor the wisdom of all of your stakeholders • Supported by infrastructure

  37. Practice: Data collection tool design

  38. Say what? If the question is a scale… “How valuable was your involvement with Sample Program?” • Very valuable • Somewhat valuable • Not valuable

  39. Then, you can report outcome statements like: 65% of all respondents found their involvement with Sample Program to be very valuable – OR – Over half of respondents found their involvement with Sample Program to be very valuable. 80% of all respondents found their involvement with Sample Program to be somewhat or very valuable.
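  Behind statements like these is a simple frequency count. A minimal sketch in Python, using made-up responses chosen so the tallies reproduce the percentages above:

```python
from collections import Counter

# Made-up scale responses (13 + 3 + 4 = 20 respondents)
responses = (["Very valuable"] * 13
             + ["Somewhat valuable"] * 3
             + ["Not valuable"] * 4)

counts = Counter(responses)
total = len(responses)

pct_very = 100 * counts["Very valuable"] / total
pct_somewhat_or_very = 100 * (counts["Very valuable"]
                              + counts["Somewhat valuable"]) / total

print(f"{pct_very:.0f}% of respondents found their involvement very valuable")
print(f"{pct_somewhat_or_very:.0f}% found it somewhat or very valuable")
```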

  40. Say what? If the question is open-ended… “What was the most valuable part of volunteering for Sample Program?”

  41. Then, you can report outcome statements like: Respondents frequently reported community engagement, connecting with participants, and learning about issues in the community as the most valuable part of volunteering with Sample Program.
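  Open-ended answers are typically “coded” into themes before reporting. A sketch with hypothetical hand-coded data: each answer has been tagged with one or more theme labels during analysis, and the report simply counts how often each theme appears.

```python
from collections import Counter

# Hypothetical hand-coded themes, one list of labels per respondent
coded_responses = [
    ["community engagement"],
    ["connecting with participants", "community engagement"],
    ["learning about issues in the community"],
    ["connecting with participants"],
]

theme_counts = Counter(t for themes in coded_responses for t in themes)
for theme, n in theme_counts.most_common():
    print(f"{theme}: mentioned by {n} of {len(coded_responses)} respondents")
```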

  42. Say what? If the question is a retrospective pre-test… “Please rate your opinion of the importance of volunteerism before AND after participating with Sample Program.”

  43. Then, you can report outcome statements like: “85% of respondents rated volunteerism as very important after participating with Sample Program, compared to only 42% before participating.”
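  A retrospective pre-test boils down to computing the same percentage twice, once from the “before” ratings and once from the “after” ratings of the same respondents. A sketch with invented data (the 85% and 42% above come from the slide, not from these numbers):

```python
# Invented (before, after) ratings, one tuple per respondent
ratings = [
    ("Somewhat important", "Very important"),
    ("Very important", "Very important"),
    ("Not important", "Very important"),
    ("Somewhat important", "Somewhat important"),
]

total = len(ratings)
pct_before = 100 * sum(b == "Very important" for b, _ in ratings) / total
pct_after = 100 * sum(a == "Very important" for _, a in ratings) / total

print(f"{pct_after:.0f}% rated volunteerism as very important after "
      f"participating, compared to {pct_before:.0f}% before")
```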

  44. Types of questions What are the pros and cons of these question types?

  45. Focus on simple and effective Photo by Heather McQuaid

  46. Use parameters to make responding easier Photo by Oli Shaw

  47. Avoid double-barreled questions, jargon, and vague wording Photo by Nate Bolt

  48. Practice: Survey or interview question design

  49. Implementation: Data gathering • Smile! • Practice in advance • Have a contact person for questions • Respect time & privacy • Create instructions

  50. Survey administration • Timing • Response rates
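  Response rate is simply the share of distributed surveys that come back completed; tracking it tells you how well your results can speak for the whole group. A one-line calculation with made-up numbers:

```python
invited = 120    # surveys distributed (made-up number)
completed = 78   # surveys returned (made-up number)

response_rate = 100 * completed / invited
print(f"Response rate: {response_rate:.0f}%")  # -> 65%
```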
