
Evaluation (a.k.a. Assessment)


Presentation Transcript


  1. Evaluation (a.k.a. Assessment) Krista S. Schumacher Schumacher Consulting.org 918-284-7276 krista@schumacherconsulting.org www.schumacherconsulting.org Prepared for the Oklahoma State Regents for Higher Education 2010 Summer Grant Writing Institute

  2. To Evaluate or to Assess? • Technically speaking… • Assessment • Long-term outcomes, aggregated judgment • Evaluation • Short-term outcomes, “unique event” judgment

  3. Why Evaluate? • How will you know your project is progressing adequately to achieve objectives? • How will funders know your project was successful? • Increasing emphasis placed on evaluation by funders, e.g.: • U.S. Department of Education • National Science Foundation • Substance Abuse and Mental Health Services Administration (SAMHSA)

  4. Why Evaluate? • Improve the program – • “Balancing the call to prove with the need to improve.” (W.K. Kellogg Foundation) • Determine program effectiveness – • Evaluation supports “accountability and quality control” (Kellogg Foundation) • Significant influence on program’s future • Generate new knowledge – • Not just research knowledge • Determines not just that a program works, but analyzes how and why it works • With whom is the program most successful? • Under what circumstances?

  5. Why Evaluate? WHAT WILL BE DONE WITH THE RESULTS? “Evaluation results will be reviewed (quarterly, semi-annually, annually) by the project advisory board and staff. Results will be used to make program adjustments as needed.”

  6. Types of Evaluation Process evaluation: • What processes are used and how well do they work? Outcome evaluation: • Did the project achieve its stated objectives?

  7. Process Evaluation • What was provided and to whom? • services (modality, type, intensity, duration) • recipients (individual demographics and characteristics) • gender, age, race/ethnicity, income level, first-generation status • context (institution, community, classroom) • cost (did the project stay within budget?) • Do processes match the proposed project plan? • What types of deviation from the plan occurred? • What led to the deviations? • What effect did the deviations have on the project and evaluation?

  8. Outcome Evaluation • What effect did the program have on participants? • Activities / Objectives • Achievement / Attitudes and beliefs • What program/contextual factors were associated with outcomes? • What individual factors were associated with outcomes? • How durable were the effects? • What correlations can be drawn between outcomes and the program? • How do you know that the program was the cause of the effect?

  9. Who will Evaluate? • External evaluators increasingly required or strongly recommended • Partners for effective and efficient programs • Methodological orientations • Philosophical orientations • Experience and qualifications

  10. How much will it cost? • External evaluations cost money…period. • Standard recommendation: • 5% to 10% of total budget • Kellogg Foundation; U.S. Dept of Ed. • Check funder limits on evaluation • Ensure cost is reasonable but sufficient

  11. Two Types of Data • Quantitative • Numbers based on objectives and activities • Types of data needed: • Number of participants (process) • Grade point averages (outcome) • Retention rates (outcome) • Survey data (outcome and process) • Qualitative • Interviews • Focus groups • Observation

  12. Methods/Instruments • How are you going to get your data? • Establish baseline data • Pre- and post-assessments (knowledge, skills) • Pre- and post-surveys (attitudinal) • Enrollment rosters • Meeting minutes • Database reports • Institutional Research Office (I.R.)

  13. Data Analysis • Quantitative data • Data analysis programs: • SPSS (Statistical Package for the Social Sciences), Stata, etc. • Descriptive and statistical data: • On a scale of 1 to 5, with 1 being “strongly disagree” and 5 being “strongly agree,” please indicate the extent to which you agree or disagree that pigs can fly. • Descriptive: • 150 (or 75%) of respondents agree or strongly agree that pigs can fly. • Statistical (t-test, ANOVA, etc.): • There is a statistically significant difference (p<.05) between the sexes in agreement that pigs can fly, with men more likely than women to agree or strongly agree with this statement.
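The two kinds of summary on this slide (descriptive percentage, then a t-test) can be sketched in a few lines of Python. This is a hypothetical illustration, not from the presentation: the Likert responses are invented, and a pooled-variance t statistic is computed by hand (in practice SPSS, Stata, or a statistics library would report the p-value directly).

```python
import math
import statistics

# Invented 5-point Likert responses: 1 = strongly disagree ... 5 = strongly agree
men = [5, 4, 4, 5, 3, 4, 5, 4]
women = [2, 3, 2, 4, 3, 2, 3, 2]
responses = men + women

# Descriptive statistic: share who agree or strongly agree (rating 4 or 5)
agree = sum(1 for r in responses if r >= 4)
pct = 100 * agree / len(responses)
print(f"{agree} ({pct:.0f}%) of respondents agree or strongly agree")

# Inferential statistic: independent-samples t-test with pooled variance
n1, n2 = len(men), len(women)
m1, m2 = statistics.mean(men), statistics.mean(women)
v1, v2 = statistics.variance(men), statistics.variance(women)  # sample variances
pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(pooled * (1 / n1 + 1 / n2))
print(f"t({n1 + n2 - 2}) = {t:.2f}")
# With 14 degrees of freedom, |t| > 2.145 is significant at p < .05 (two-tailed)
```

The same comparison in SPSS or Stata would be a one-line command; the point of the sketch is only to show what sits behind a “statistically significant difference (p&lt;.05)” claim.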

  14. Data Analysis • Qualitative Data • Data analysis programs: • NVivo, ATLAS.ti, etc. • More than pithy anecdotes • “May explain – and provide evidence of – those hard-to-measure outcomes that cannot be defined quantitatively.” – W.K. Kellogg Foundation • Provides insight into how and why a program is successful • Analyze for themes that support (or don’t) quantitative data

  15. Two Types of Timeframes • Formative • Ongoing throughout life of grant • Measures activities and objectives • Summative • At conclusion of grant funding • NEED BOTH!

  16. Timelines • When will evaluation occur? • Monthly? • Quarterly? • Semi-annually? • Annually? • At the end of each training session? • At the end of each cycle?

  17. Origin of the Evaluation: Need and Objectives Need: For 2005-06, the fall-to-fall retention rate of first-time degree-seeking students was 55% for the College’s full-time students, compared to national average retention rates of 65% for full-time students at comparable institutions (IPEDS, 2006). Objective: The fall-to-fall retention rate of full-time undergraduate students will increase by 3 percentage points each year from a baseline of 55% to 61% by Sept. 30, 2010.
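The arithmetic linking the baseline to the target in this objective can be checked with a short sketch; the year-by-year targets below are an illustration derived from the stated 55% baseline and 3-point annual gain, not figures from the slide.

```python
# Yearly retention-rate targets implied by the objective:
# baseline 55%, rising 3 percentage points per year toward 61%.
baseline = 55     # percent, from the need statement
annual_gain = 3   # percentage points per year
target = 61       # percent, the stated objective

targets = []
rate = baseline
while rate < target:
    rate += annual_gain
    targets.append(rate)
print(targets)  # prints [58, 61]: two annual steps reach the 61% target
```

Writing the targets out this way also makes the objective easy to report against in each annual evaluation cycle.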

  18. Evaluation Data Collection and Reporting Plan

  19. BEWARE THE LAYERED OBJECTIVE! • By the end of year five, five (5) full-time developmental education instructors will conduct 10 workshops on student retention strategies for 200 adjunct instructors.

  20. Logic Models • From: University of Wisconsin-Extension, Program Development and Evaluation • http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html • A Logic Model is… • A depiction of a program showing what the program will do and what it is to accomplish • A series of “if-then” relationships that, if implemented as intended, lead to the desired outcomes • The core of program planning and evaluation • Example: Situation: Hungry → Inputs: Get food → Outputs: Eat food → Outcomes: Feel better
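The example row on this slide can be sketched as an ordered if-then chain. The list structure and loop are assumptions for illustration only; the stage names and entries come from the slide.

```python
# The slide's example logic model as an ordered chain of stages.
logic_model = [
    ("Situation", "Hungry"),
    ("Inputs", "Get food"),
    ("Outputs", "Eat food"),
    ("Outcomes", "Feel better"),
]

# Read consecutive stages as a series of "if-then" relationships.
for (stage, entry), (_, nxt) in zip(logic_model, logic_model[1:]):
    print(f"If '{entry}' ({stage}), then '{nxt}'")
```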

  21. Evaluation Resources • W.K. Kellogg Foundation – “Evaluation Toolkit” • http://www.wkkf.org/default.aspx?tabid=75&CID=281&NID=61&LanguageID=0 • Newsletter resource – The PEN (Program Evaluation News) • http://www.the-aps.org/education/promote/content/newslttr3.2.pdf • NSF-sponsored program • www.evaluatorsinstitute.com • American Evaluation Association • www.eval.org • Western Michigan University, The Evaluation Center • http://ec.wmich.edu/evaldir/index.html (directory of evaluators) • OSRHE list of evaluators and other resources • http://www.okhighered.org/grant%2Dopps/writing.shtml • “Evaluation for the Unevaluated” course • http://pathwayscourses.samhsa.gov/eval101/eval101_toc.htm

  22. Evaluation Resources • The Research Methods Knowledge Base • http://www.socialresearchmethods.net/ • The What Works Clearinghouse • http://www.w-w-c.org/ • The Promising Practices Network • http://www.promisingpractices.net/ • The International Campbell Collaboration • http://www.campbellcollaboration.org/ • Social Programs That Work • http://www.excelgov.org/ • Planning an Effective Program Evaluation short course • http://www.the-aps.org/education/promote/pen.htm
