
Basics of Evaluation: Parks and Recreation


Presentation Transcript


  1. Basics of Evaluation: Parks and Recreation Karla A. Henderson, Ph.D. Professor, North Carolina State University karla_henderson@ncsu.edu

  2. Framework • Evaluation Inventory (handout) • Needs Assessment Regarding Program Evaluation (Types of Needs Explained) • Small Group Needs Identification (6 people per group) Importance (Very, Moderately, Slightly, Not) 5-10 topics to Discuss Together

  3. What is Evaluation? Evaluation = systematic collection and analysis of data to address criteria to make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence Assessment = the examination of some type of need that provides a foundation for further planning Evaluation is sometimes like solving a mystery

  4. Types of Program Evaluation Macro (System Evaluation) Micro (Activity or Event Evaluation)

  5. Formal (Systematic) Evaluation MACRO and MICRO approaches: • Provides rigor • Systematic gathering (procedures and methods) of evidence • Leads to decisions and action • Criteria • Evidence • Judgment

  6. Assessments • Why assess? • to eventually use the input to design programs and objectives • to generate new program ideas • to give constituents a say • to help you be responsive • What is a need? A want? An intention? • Assessments determine all three of these so YOU can figure out how to promote what you are doing

  7. Types of needs (Caution!) • Normative needs- what should be available • Felt needs- what individuals believe they would like • Expressed needs- needs fulfilled through participation

  8. 1. Why Do Systematic Evaluation

  9. Potential Purposes: • Efficiency-How are we doing? (Management/Process Focused) • Effectiveness-What difference do our efforts make? (Impact/Outcome Focused)

  10. Accountability Era—DO YOU AGREE? • What gets measured gets done • If you don’t measure results, you can’t tell success from failure • If you can’t see success, you can’t reward it • If you can’t reward success, you’re probably rewarding failure Reinventing Government, Osborne and Gaebler, 1992 University of Wisconsin-Extension, Program Development and Evaluation

  11. Accountability Era • If you can’t see success, you can’t learn from it • If you can’t recognize failure, you can’t correct it. • If you can demonstrate results, you can win public support. Reinventing Government, Osborne and Gaebler, 1992 University of Wisconsin-Extension, Program Development and Evaluation

  12. Evaluation Process: “Begin with the end in mind.” Covey (1990), 7 Habits of Highly Effective People

  13. 2. How to Think Like an Evaluator

  14. A thinking process used by evaluators: (from Richard Krueger) • Reflecting • Develop a theory of action: a logical sequence that results in change. • Begin with what is supposed to happen--the results. • Listening • Share the theory of action with others. • Measuring • Determine your measurement strategies--how you're going to look at the program • Adding value to the program • What can evaluation do to contribute to the program? How can evaluation make the program better, more enjoyable, focused on results, accountable, satisfying to participants and educators, etc.?

  15. Ways of Thinking: • Goal Based Thinkers - "We look for goals" • Audit Thinkers - "We investigate and find out what's wrong" • Utilization Focused Thinkers - "We make evaluation useful" • Empowerment Focused Thinkers - "We empower local people" • Positivistic Thinkers - "We are scientists" • Number Thinkers - "We count--and we do it well" • Qualitative Thinkers - "We tell stories" (Richard Krueger)

  16. Practical tips for successful evaluation: (Richard Krueger) • Involve others: Utilization, impact, and believability emerge from involving colleagues and clientele. If you want the information used, then involve others! • Ask yourself: Do I have a program? And is it worthy of evaluation? • Consider your purpose for evaluating (see earlier slide) • Consider who wants the evaluation--Who requested it? • Use a variety of evaluation methods when possible. • Keep costs low by sampling strategically • Keep interest high by adding payoff for the participant. • Start with goals, but don't be unduly limited by goals • Consider "early evaluation" • Design the evaluation carefully. The evaluation should: • Enhance the program • Yield information beneficial to stakeholders • Conserve resources

  17. 3. Differences Between Assessment, Evaluation, and (Action) Research

  18. Evaluation = systematic collection and analysis of data to address criteria to make judgments about the worth or improvement of something; making decisions based on identified criteria and supporting evidence Assessment = the examination of some type of need that provides a foundation for further planning Action Research = evaluation leading to decisions/changes

  19. 4. Steps Involved in Evaluation Process

  20. Steps • Problem, Idea Identified • Problem Statement/Purpose Determined • Instrument/Method Chosen • Data Sources • Data Collection • Data Analysis • Conclusions/Recommendations
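
As a minimal illustration of the last three steps (data collection, data analysis, conclusions/recommendations), the Python sketch below computes mean satisfaction from hypothetical survey ratings and judges it against a preset criterion. The ratings and the 4.0 cutoff are illustrative assumptions, not values from the presentation.

    import statistics

    def judge_program(ratings, criterion=4.0):
        """Compare mean satisfaction (1-5 scale) against an agreed criterion."""
        mean_rating = statistics.mean(ratings)
        verdict = "meets criterion" if mean_rating >= criterion else "needs improvement"
        return mean_rating, verdict

    # Hypothetical post-program questionnaire ratings (the data collection step)
    collected = [5, 4, 4, 3, 5, 4, 2, 5]
    mean_rating, verdict = judge_program(collected)
    print(f"Mean satisfaction: {mean_rating:.2f} -> {verdict}")

The structure mirrors the criteria, evidence, and judgment triad from slide 5: the criterion is set before the data are examined.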

  21. Evaluation Process: “Begin with the end in mind.” Covey (1990), 7 Habits of Highly Effective People

  22. 5. What Should be Evaluated

  23. Areas to Evaluate • Personnel • Places • Policies • Programs • Participant Outcomes

  24. Potential Purposes: • Efficiency-How are we doing? (Mgmt/Process Focused) • Effectiveness-What difference do our efforts make? (Impact/Outcome Focused)

  25. Levels of Evaluation: • END RESULTS (Impact) • PRACTICE CHANGE (Outcomes) • KASA CHANGE (Knowledge, attitudes, skills, and aspirations)(Outcomes) • REACTIONS (SATISFACTION) (Outputs) • PEOPLE INVOLVEMENT (Outputs) • ACTIVITIES (Outputs) • INPUTS--RESOURCES

  26. What is sampling? • A population is the theoretically specified aggregation of study elements. • A sample is a subset of elements selected to represent that population.

  27. Types of Sampling • Probability • Non-Probability • Theoretical

  28. Probability • Probability sampling: Samples are selected in accord with probability theory, typically involving some random selection mechanism. Random, Stratified Random, Systematic, Cluster • Representativeness: Quality of a sample having the same distribution of characteristics as the population from which it was selected.
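
The sketch below shows how three of these strategies differ in practice, using assumed data (a hypothetical registration list of 100 participants and made-up age strata); cluster sampling is omitted for brevity.

    import random

    population = [f"participant_{i}" for i in range(1, 101)]  # hypothetical list

    # Simple random: every participant has an equal chance of selection.
    simple_random = random.sample(population, k=10)

    # Systematic: every k-th participant after a random start.
    k = len(population) // 10
    systematic = population[random.randrange(k)::k]

    # Stratified random: random selection within predefined strata (assumed age groups).
    strata = {"youth": population[:40], "adult": population[40:80], "senior": population[80:]}
    stratified = [p for group in strata.values() for p in random.sample(group, k=3)]

    print(len(simple_random), len(systematic), len(stratified))  # 10 10 9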

  29. Nonprobability • Technique in which samples are selected in a way that is not suggested by probability theory. Purposive, Convenience, Quota, Expert, Snowball

  30. 6. Who Should Conduct Evaluations

  31. WHO DOES THE EVALUATIONS? • Internal • You! • Staff • Agency Evaluation Personnel • External • Consultants • University Students! Regardless—YOU have to know your purpose, goals, and appropriate methods!

  32. 7. When Should Evaluations Be Done

  33. Timing • Assessments (planning) – find out where to begin based on what you know • Formative (process) – concerned with efficiency and effectiveness • Summative (product) – overall performance

  34. Approaches to Needs Assessments • Literature/Professional Development • Advisory Groups • Structured Interviews (individual and focus groups) • Surveys

  35. Formative Evaluation • Evaluation in Process • Allows Changes to be Made Immediately • Most often Focused on Inputs and Outputs

  36. Summative Evaluation • At the END of something • “What was?” • Recommendations for the Future

  37. 8. What Tools are Available for Evaluations

  38. Data Collection Methods MAJOR ONES: • Questionnaires/Surveys • Interviews (Individual and Focus Groups) (Pros and Cons)
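
As a small sketch of what questionnaire data look like once collected, the Python snippet below tabulates responses on the same importance scale used in the handout exercise (Very, Moderately, Slightly, Not); the responses themselves are invented for illustration.

    from collections import Counter

    scale = ["Very", "Moderately", "Slightly", "Not"]
    responses = ["Very", "Moderately", "Very", "Slightly", "Not",
                 "Very", "Moderately", "Very"]  # hypothetical answers

    counts = Counter(responses)
    for label in scale:
        share = 100 * counts[label] / len(responses)
        print(f"{label:<12}{counts[label]:>3} ({share:.0f}%)")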

  39. Other Methods: • Systematic Observation • Checklists • Field Observations • Unobtrusive Measures • Physical Evidence • Archives • Covert Observations • Visual Analyses • Experimental Designs • Case Studies

  40. 9. What “Cutting Edge” Strategies Exist for Evaluation

  41. To consider (see below for further info) • Logic Models—"outcome focused" • Trends Analysis • Benchmarking • PRORAGIS (NRPA)
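
To make the "outcome focused" idea concrete, here is a minimal logic model sketched in Python for a hypothetical swim-lesson program; every entry is an assumed example, not content from the presentation.

    logic_model = {
        "inputs":     ["instructors", "pool time", "program budget"],
        "activities": ["eight weekly swim lessons"],
        "outputs":    ["120 youth complete the course"],
        "outcomes":   ["participants swim 25 meters unassisted"],
        "impact":     ["reduced drowning risk in the community"],
    }
    for level, items in logic_model.items():
        print(f"{level:>10}: {', '.join(items)}")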

  42. 10. CAPRA Standards related to Evaluation

  43. CAPRA Standards • 10.1 Systematic Evaluation Program There shall be a systematic evaluation plan to assess outcomes and the operational efficiency and effectiveness of the agency. • 10.2 Demonstration Projects and Action Research There shall be at least one experimental or demonstration project or involvement in some aspect of research, as related to any part of parks and recreation operations, each year.

  44. CAPRA Standards • 10.3 Evaluation Personnel There shall be personnel either on staff or a consultant with expertise to direct the technical evaluation/research process. • 10.4 Employee Education There shall be an in-service education program for professional employees to enable them to carry out quality evaluations.

  45. 11. Evaluating Inputs, Outputs, Outcomes, and Impacts

  46. Evaluation Approaches—KEY POINTS! • Multiple LEVELS of evaluations: • Inputs (costs, personnel, etc.) • Outputs • Activities • People involvement • Reactions • Outcomes • KASA-knowledge, attitudes, skills, aspirations • Behavior CHANGE • Impacts • Long-term Benefits

  47. 12. Using Goals and Objectives as Basis for Evaluation

  48. What are the goals of the program? • What do we expect to happen? • What do we want participants to do, gain, learn? BEGIN WITH THE END IN MIND!!

  49. Goals and Objectives • Goals • Broad, long-range statements that define the programs/services that are going to be provided • Objectives • Specific statements (about the attainable parts of the goal) that are measurable and have some dimension of time.

  50. Objectives • Specific • Must be clear and concrete • Measurable • There must be some way to determine whether or not the desired results have been achieved • Achievable • Must be attainable and reality-based!!! • Relevant • Must be useful; must have worth to your organization • Time-limited/Time-connected • Must specify a time frame for accomplishment Adapted from Edginton, Hanson, & Edginton, 1980
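
One hypothetical way to operationalize these criteria is to record each objective with an explicit field per SMART element, so gaps are easy to spot; the objective text and every field below are invented for illustration.

    objective = {
        "specific":     "80% of day-camp participants pass the beginner swim test",
        "measurable":   "pass rate on the beginner swim test",
        "achievable":   "based on an assumed 75% pass rate last season",
        "relevant":     "supports the agency's aquatic-safety goal",
        "time_limited": "by August 31",
    }
    missing = [k for k, v in objective.items() if not v]
    print("SMART check:", "complete" if not missing else f"missing: {missing}")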
