
Data Analysis for Evaluation Eric Graig, Ph.D.


Presentation Transcript


  1. Data Analysis for Evaluation Eric Graig, Ph.D.

  2. Purpose of this Training • To increase your skills in analysis and interpretation of the information you collect • To increase your ability to report your findings to a variety of audiences • To learn how to make evaluation results actionable

  3. Three Steps to Data Analysis • Analyze results • Communicate findings • Use findings for program improvement

  4. Keep it Simple • Aim for a systematic effort • Keep your audience in mind • Pay attention to the usability of your evaluation report

  5. Two Kinds of Data • Quantitative • Qualitative

  6. Quantitative Data • Data that is numerical, counted, or compared on a scale • Demographic data • Answers to closed-ended survey items • Attendance data • Scores on standardized instruments

  7. Qualitative Data • Narratives, logs, experience • Focus groups • Interviews • Open-ended survey items • Diaries and journals • Notes from observations

  8. Summary Statistics • Measures of central tendency: mean, median • Measures of spread: range • Frequency distributions • Percentage distributions

  9. Frequency Distributions

  10. Percentage Distributions
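Slides 9 and 10 can be sketched in code. A minimal example using only the Python standard library, with made-up survey responses (the categories and counts are illustrative, not from the presentation):

```python
from collections import Counter

# Hypothetical responses to a closed-ended survey item
responses = ["Friend", "Web Search", "Friend", "Librarian",
             "Friend", "Web Search", "Court Referral", "Friend"]

freq = Counter(responses)                            # frequency distribution
total = sum(freq.values())
pct = {k: 100 * v / total for k, v in freq.items()}  # percentage distribution

for category in sorted(freq, key=freq.get, reverse=True):
    print(f"{category:15s} {freq[category]:3d}  {pct[category]:5.1f}%")
```

The frequency table counts raw occurrences; dividing by the total turns it into the percentage distribution shown on slide 10.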

  11. Graphing Frequency Data • Example item: "How did you first hear about the web-site?" • Response categories: Court Referral, Social Worker, Friend or Acquaintance, Librarian, Web Search Engine, Newspaper Story, Other

  12. Means and Medians • Example data set: Mean = 87, Median = 94

  13. Some Subtleties • Two data sets compared: Mean = 81 vs. Mean = 87

  14. Histograms
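A histogram is just a frequency distribution over numeric bins. A minimal sketch, assuming hypothetical satisfaction scores and a bin width of 10:

```python
from collections import Counter

# Hypothetical satisfaction scores on a 0-40 scale
scores = [12, 18, 22, 25, 25, 27, 31, 33, 35, 38]

# Assign each score to the lower edge of its bin, then count
bin_width = 10
bins = Counter((s // bin_width) * bin_width for s in scores)

# Text-mode histogram: one '#' per observation in the bin
for low in sorted(bins):
    print(f"{low:2d}-{low + bin_width - 1:2d} | " + "#" * bins[low])
```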

  15. Cross Tabulations
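A cross tabulation counts cases by two variables at once. A sketch with invented records pairing a clinic with a satisfaction response:

```python
from collections import Counter

# Hypothetical records: (clinic, satisfied?)
records = [("A", "yes"), ("A", "no"), ("B", "yes"),
           ("B", "yes"), ("A", "yes"), ("B", "no")]

# Cross tabulation: count each (clinic, response) pair
crosstab = Counter(records)

for clinic in ("A", "B"):
    row = {resp: crosstab[(clinic, resp)] for resp in ("yes", "no")}
    print(clinic, row)
```

Each row of the printed table is one clinic; comparing rows is what makes the sub-group differences on later slides visible.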

  16. Graphing Comparisons • Bar chart: "Satisfaction with Services," satisfaction score (0 to 40) on the vertical axis, clinics A through E on the horizontal axis

  17. Some Subtleties • Grouped bar chart: "Satisfaction with Services," satisfaction score (0 to 16) by clinic (A through E), with separate bars for Staff, Advice, and Facility

  18. Some More Subtleties • The same data regrouped: satisfaction score (0 to 16) by component (Staff, Advice, Facility), with separate bars for clinics A through E

  19. A Note on Quantitative Data Entry • Several approaches • Paper and pencil tally • Word processing table • Spreadsheet • Custom database
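The spreadsheet approach on slide 19 amounts to one row per respondent and one column per item. A minimal sketch of that layout using Python's standard `csv` module (the field names and values are hypothetical; an in-memory buffer stands in for a real file):

```python
import csv
import io

# Hypothetical closed-ended survey entries: one row per respondent
rows = [
    {"id": 1, "age": 34, "satisfaction": 4},
    {"id": 2, "age": 52, "satisfaction": 5},
]

# io.StringIO stands in for open("data.csv", "w", newline="")
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "age", "satisfaction"])
writer.writeheader()
writer.writerows(rows)

print(buf.getvalue())
```

The same row-per-case layout transfers directly to a word-processing table or a custom database.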

  20. Qualitative Data • Textual data • Interview transcripts • Case notes/ clinical notes • Open-ended survey questions • Photographs • Video recordings

  21. Qualitative Analysis • Identify themes or patterns • Pre-set themes • Determine as you go • Tag text segments with themes
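Tagging text segments with pre-set themes, as slide 21 describes, can be mimicked with a simple keyword match. A sketch with invented themes, keywords, and snippets (real qualitative coding is a human judgment task; this only illustrates the bookkeeping):

```python
# Hypothetical pre-set themes, each with keywords that signal it
themes = {
    "access": ["transportation", "distance", "hours"],
    "staff":  ["counselor", "staff", "receptionist"],
}

snippets = [
    "The counselor really listened to me.",
    "Getting there was hard without transportation.",
]

# Tag each text segment with every theme whose keyword appears
tagged = []
for text in snippets:
    tags = [t for t, kws in themes.items()
            if any(kw in text.lower() for kw in kws)]
    tagged.append((text, tags))

for text, tags in tagged:
    print(tags, "-", text)
```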

  22. Sample Data Sheet • Fields: Site, Interviewee, Date, Tape number, Length, Question number

  23. Exercise • Break into small groups • Using the sheet provided, search for snippets of text that relate to the codes provided • Identify any other important themes that are not on the sheet

  24. Communicating Findings • “Information that is not effectively shared with others will not be effectively used.” Source: Building a Successful Evaluation, Center for Substance Abuse Prevention

  25. Audience • Staff • Board • Funders • Partners • Other agencies • Public

  26. Reporting Formats • Written report • Data book • Flyer • Formal oral presentation • Round table with partners/ stakeholders • Poster • Web site

  27. Whatever strategy you choose: • Link findings to the program’s outputs and desired outcomes • Include successes and challenges • Support your claims with data • Acknowledge knowledge gaps

  28. Organizing Findings • Overview • Describe program/ describe evaluation • Program Implementation • What did you do, and how well did you do it? • Program Outcomes: • What were the changes that occurred? • Lessons learned • Sub-group differences

  29. Overview Section • Your organization • Your program • Purpose of the evaluation • Evaluation design • Structure of the report

  30. Program Implementation • Characteristics of program participants • Demographics • Outputs • Number of sessions held • Number of participants served • Partnerships formed • Satisfaction with program

  31. Program Outcomes • What has changed? • What do participants now know? • What are they now able to do? • What do they actually do? • How are their lives different?

  32. Lessons Learned • Which of the program’s assumptions were validated, which need to be reconsidered? • Does the program affect different kinds of people differently? • Are results better with some groups? • How will findings be addressed? • What new hypotheses did the evaluation generate? • How did the evaluation work?

  33. Additional Resources • One-on-one technical assistance • Synchronous course • Asynchronous material • InnoNet website www.innonet.org
