
NSP Evaluation Roundtable


Presentation Transcript


  1. NSP Evaluation Roundtable: Nonprofit Evaluation Basics, October 11, 2018

  2. Goals • Promote ongoing nonprofit discussion, sharing and learning about evaluation • Provide new and relevant information to meet nonprofit evaluation needs • Provide a community of practice for area nonprofit and evaluation professionals

  3. Nonprofit Evaluation Basics. Anita Baker, Ed.D., Evaluation Services

  4. Agenda 8:30 – 9:20 Introductions and Evaluation Overview 9:20 – 10:05 Collecting and Analyzing Evaluation Data, Part 1 10:05 – 10:15 Break and Networking 10:15 – 11:35 Collecting and Analyzing Evaluation Data, Part 2 11:35 – 11:45 Evaluation Reporting, Final Advice 11:50 – 12:00 Closure

  5. Working Definition of Program Evaluation: Thoughtful, systematic collection and analysis of information about activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and inform decisions.

  6. Getting Started • Why evaluate? • What to evaluate? • MORE SPECIFICALLY, what are the key Evaluation Questions to address? • How to evaluate? • Who should evaluate? • When to evaluate? • How to use the findings?

  7. Evaluation Strategy Clarification • All evaluations are partly social, partly political, and partly technical. • Both qualitative and quantitative data can be collected and used, and both are valuable. • There are multiple ways to address most evaluation needs. • Different evaluation needs call for different designs, data and data collection strategies.

  8. What do you need to do to conduct an evaluation? • Specify key evaluation questions • Specify an approach (evaluation design) • Apply evaluation logic • Collect and analyze data • Summarize and share findings

  9. Evaluation Questions • Focus and drive the evaluation. • Should be carefully specified and agreed upon in advance of other evaluation work. • Generally represent a critical subset of information that is desired.

  10. Evaluation Question Criteria • It is possible to obtain data to address the questions. • There is more than one possible “answer” to the question. • The information to address the questions is wanted and needed. • It is known how resulting information will be used internally (and externally). • The questions are aimed at changeable aspects of activity.

  11. Evaluation Question Examples To what extent did the program work (attain goals)? Should the program be continued/ended? Was implementation in compliance? Were funds used appropriately, for intended purposes? To what extent were desired client outcomes achieved? What are the program’s strengths and weaknesses? How and to what extent are participants progressing toward desired outcomes? Which types of participants are making good progress and which aren’t? What kinds of implementation problems have emerged, and how are they addressed? What’s happening that wasn’t expected? What are staff and participant perceptions of the program?

  12. What is an Evaluation Design? An evaluation design is a written summary that communicates key decisions/plans to evaluators, program officials and other stakeholders. Evaluation designs help evaluators and their partners think about and structure evaluation work.

  13. Good Evaluation Designs Include: • Summary information about the program • The questions to be addressed by the evaluation • The data collection strategies that will be used • The individuals who will undertake the activities • When the activities will be conducted • The products of the evaluation (who will receive them and how they should be used) • Projected costs to do the evaluation • A workplan

  14. Program Planning Includes Evaluation 1. Think about the results you want. 2. Decide what strategies will help you achieve those results. 3. Think about what inputs you need to conduct the desired strategies. 4. Specify outcomes, identify indicators and targets (decide in advance how good is good enough). 5. Document how services are delivered. 6. Evaluate actual results.

  15. Outcomes and Indicators • Outcomes: changes or levels of behavior, skills, knowledge, attitudes, condition or status. • Indicators: specific, measurable characteristics or changes that represent achievement of an outcome.

  16. Indicator: Reminders • Many outcomes have more than one indicator • Identify the set of indicators that accurately signal achievement of an outcome (get stakeholder input)

  17. Targets Specify the amount or level of outcome attainment expected, hoped for or required. Targets can be set . . . • Relative to external standards (when available) • Relative to past performance/similar programs • Based on professional hunches

  18. Target: Reminders • Targets should be specified in advance; this requires buy-in. • Carefully word targets so they are not over- or under-ambitious, make sense, and are in sync with time frames. • If a target indicates a change in magnitude, be sure to specify initial levels and which direction is positive.

  19. Outcome, Indicator, Target - EXAMPLE Participants will attend 70% or more of all available sessions. At least half of participants will participate in 100 or more hours per quarter.

  20. Outcome, Indicator, Target - EXAMPLE Two-thirds of campers enrolled in swimming will demonstrate competency in 3 basic strokes. Most campers (85%) will demonstrate mastery of all performance dance moves.
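
  A minimal Python sketch of how attendance targets like those on slide 19 could be checked against program records. The session count, participant IDs, and hours below are hypothetical illustrations, not actual program data:

    # Checking slide 19-style indicators/targets against hypothetical attendance records.
    sessions_available = 40                        # assumed number of sessions offered
    sessions_attended = {"P01": 35, "P02": 22, "P03": 38, "P04": 18, "P05": 30}
    hours_per_quarter = {"P01": 120, "P02": 85, "P03": 150, "P04": 60, "P05": 104}

    # Indicator 1: share of participants attending 70% or more of available sessions.
    met_attendance = [p for p, s in sessions_attended.items()
                      if s / sessions_available >= 0.70]
    pct_attendance = 100 * len(met_attendance) / len(sessions_attended)

    # Indicator 2: share of participants with 100 or more hours in the quarter
    # (target: at least half of participants).
    met_hours = [p for p, h in hours_per_quarter.items() if h >= 100]
    pct_hours = 100 * len(met_hours) / len(hours_per_quarter)

    print(f"{pct_attendance:.0f}% of participants attended 70%+ of sessions")
    print(f"{pct_hours:.0f}% of participants logged 100+ hours (target: at least 50%)")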

  21. Things to Think about Before Collecting Data • What are your evaluation questions? • Who is your target group? Where are they? • Do they need assistance to answer? Do you need consent? • What type of data collection strategy will you use? How often? • What are the specific strategies – what day, what time, who’s responsible? Do you need incentives? • How much time do you need to collect data? • How will you keep track of what you collect, store it and maintain confidentiality?

  22. Collecting Evaluation Data: Interviews, Surveys, Observations, Record Reviews • All have limitations and benefits • All require preparation on the front end: • Instrument development and testing • Administration plan development • Analysis plan development

  23. Surveys USE SURVEYS TO: Study attitudes and perceptions Collect self-reported assessment of changes in response to program Collect program assessments Collect some behavioral reports Test knowledge Determine changes over time. • Series of items with pre-determined response choices • Can be completed by administrator or respondents • Can be conducted • “paper/pencil” • phone, internet (e-survey) • using alternative strategies • Instruments are called – surveys, “evaluations,” questionnaires

  24. Survey Results Example. Table 4: Reasons for Visiting the ICA [table not reproduced in this transcript]. Note: the visitor categories selected for the survey were informed by research conducted by John Falk, 2009.

  25. Survey Results Example. Table 6: Planned Actions for Block Party Visitors [table not reproduced in this transcript].

  26. Survey Results Example, n=106 [results not reproduced in this transcript].

  27. Assessing Survey Instruments • Are questions comprehensive without duplication, exhaustive without being exhausting? • Do answer choices match question stem, provide coverage, avoid overlap? • Are other data needs (e.g., characteristics of respondent) addressed? • Do order and formatting facilitate response? Are directions clear? • Does the survey have face validity?

  28. Things to Think about Before Administering a Survey • Target group: who, where, sampling? • Respondent assistance, A/P consent • Type of survey, frequency of administration • Anonymity vs. Confidentiality • Specific fielding strategies (including alternatives), incentives? • Time needed for response • Tracking administration and response • Data analysis plans • Storing and maintaining confidentiality

  29. Key Survey Administration Strategies • Captive surveys • Use (trained, multiple) survey administrators • Recruit/inform respondents • Facilitate response (clipboards, deposit boxes, pencils) • Use incentives • Intercept surveys • Use (trained, multiple) survey administrators • Develop brief and feasible instruments* • * (can include one or multiple sets of questions) • Alternative surveys • Sticky dots, candy, marbles, relevant symbols • Exhibit based (tablets, manipulatives) • Electronic surveys • Can be used to administer and/or analyze data • Tablets and handheld strategies available

  30. Interviews USE INTERVIEWS TO: Study attitudes and perceptions Collect self-reported assessment of changes in response to program Collect program assessments Document program implementation Determine changes over time. • One-sided conversation with questions mostly pre-determined, but open-ended. • Respondent answers in own terms. • Can be conducted • in person • on phone • one-on-one, or groups • Instruments are called – protocols, schedules or guides

  31. Observations USE OBSERVATIONS TO: Document program implementation Witness levels of skill/ability, program practices, behaviors Determine changes over time. • Observations are conducted to view and hear actual program activities. • Users of reports will know what and how events occur. • Can be focused on • programs overall • participants • pre-selected features • Instruments are called – protocols, guides, checklists

  32. Observation Results Many different types of arts activities were undertaken, and personal development was either delivered directly or integrated with arts activities. Of the 57 different combinations of programming at the 10 sites, only 3 included activities that were not wholly successful with their target groups, 2 of those because of a mismatch between the instructor and the participant group. At all sites, ongoing projects were underway and examples of participant work were readily visible. Teaching artists were demonstrating skills, giving youth opportunities to try the skills, and providing one-on-one assistance as needed.

  33. Participant Observation, Final Session: Practical Exam Results for Group B (n=9) [results not reproduced in this transcript]

  34. Strategies for Analyzing Qualitative Data • Segment or partition data (i.e., divide it into meaningful analytical units – “chunks”) • Code data (e.g., + or -) and count it • Organize, summarize and display data – using snippets or quotes • Verify/validate results • Revise summaries and displays accordingly

  35. Coding Qualitative Data • A priori or deductive codes: predetermined categories based on accepted theory or program knowledge • Inductive codes: based on raw data (not predetermined)
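
  A minimal Python sketch of the “code and count” steps from slides 34–35, applying a small a priori codebook to open-ended comments via simple keyword matching. The codes, keywords, and comments are hypothetical; in practice coding is done by trained reviewers against a codebook developed with stakeholders:

    from collections import Counter

    # Hypothetical a priori codes and the keywords that signal them.
    codebook = {
        "staff_quality": ["instructor", "teacher", "staff"],
        "facilities": ["room", "space", "building"],
        "scheduling": ["time", "schedule", "evening"],
    }

    # Hypothetical open-ended survey comments.
    comments = [
        "The instructor was patient and encouraging.",
        "Hard to attend because of the evening schedule.",
        "The room was too small for the whole group.",
        "Great staff, but the building was hard to find.",
    ]

    # Code each comment against the codebook, then count how often each code appears.
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for code, keywords in codebook.items():
            if any(word in text for word in keywords):
                counts[code] += 1

    for code, n in counts.most_common():
        print(f"{code}: {n} of {len(comments)} comments")

  Counts like these would then be verified and displayed alongside representative quotes, as slide 34 suggests.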

  36. Record Reviews USE RECORD REVIEWS TO: Collect some behavioral reports Conduct tests, collect test results Verify self-reported data Determine changes over time • Accessing existing internal information, or information collected for other purposes. • Can be focused on • own records • records of other orgs • adding questions to existing docs • Instruments are called – protocols

  37. What Kinds of Data Can You Collect Through Record Reviews? • Background information about participants (e.g., race/ethnicity, age, gender, location, family composition) • Status information about participants (e.g., whether and how much they are working, what their income levels are, whether they have prior or multiple arrest records, whether they are owed child support) • Behavioral data (e.g., program attendance, program service utilization) • Test results (e.g., SAT or GRE scores, employment aptitude test scores, medical test results such as immunization status, TB test results) • Other outcome data (e.g., report cards, health or psychological assessments, home visit results)

  38. Record Review Analysis: Dummy Table [table not reproduced in this transcript]

  39. Record Review Example: Descriptive [results not reproduced in this transcript]

  40. Record Review Example: Evaluative [results not reproduced in this transcript]

  41. Strategies for Analyzing Quantitative Data: Important Things to Look at or Summarize • Frequencies: how often a response or status occurs • Valid percentages: frequency / adjusted total * 100 • Measures of central tendency: mean, median • Distribution: minimum, maximum, groups (quartiles, percentiles) • Cross-tabulations: relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square analyses)
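
  A minimal pandas sketch of these summaries. The records, column names, and category labels are invented for illustration, not drawn from the examples above:

    import pandas as pd

    # Hypothetical participant records.
    df = pd.DataFrame({
        "gender": ["F", "M", "F", "F", "M", None, "M", "F"],
        "completed": ["yes", "no", "yes", "yes", "no", "yes", "yes", "no"],
        "hours": [120, 45, 98, 150, 30, 88, 110, 60],
    })

    # Frequencies: how often each response occurs (missing values shown separately).
    print(df["gender"].value_counts(dropna=False))

    # Valid percentages: frequency / adjusted total (missing excluded) * 100.
    print(df["gender"].value_counts(normalize=True) * 100)

    # Measures of central tendency and distribution for hours of participation.
    print(df["hours"].mean(), df["hours"].median())
    print(df["hours"].min(), df["hours"].max())
    print(df["hours"].quantile([0.25, 0.50, 0.75]))  # quartiles

    # Cross-tabulation (contingency table): completion status by gender.
    print(pd.crosstab(df["gender"], df["completed"]))

  A chi-square test on the cross-tabulation (for example, scipy.stats.chi2_contingency) would be one way to run the significance test the slide mentions.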

  42. What happens after data are collected? • Data are analyzed according to plans. Results/findings are summarized. • Findings must be converted into a format that can be shared with others. • Action steps should be developed from findings. “Now that we know _____ we will _____.”

  43. Integrate Evaluation into Everyday Practice • Participation review • Intercept interviews • Regular surveys • Skills tests • Regular observation

  44. Cautions • Don’t shoot a mouse with a cannon! Keep it simple and stick to your plan; look for evidence and understanding, not proof/causation. • Don’t make grand claims. Be careful with generalizations; only summarize to the actual group. • Don’t over-rely on surveys. Be sure to use observations, interviews and record reviews too. • Don’t over-collect and under-analyze. Only ask for information you know you want, need and will use. • Don’t look for your watch where the light is better. Be careful using strategies that were designed for other purposes.

  45. Closure • Thank you! • For further information, visit our website • Evaluation Capacity Grant program • Look for our survey – we want your feedback!
