
SB81/ROCK What Makes Evaluation So Important?

California Probation, Parole and Correctional Association 77th Annual Training Conference. SB81/ROCK: What Makes Evaluation So Important? September 18, 2008. Prepared by the Urban Strategies Council Evaluation Team: Steve Spiker, Director of Research; Bill Heiser, Research and Program Associate


Presentation Transcript


  1. California Probation, Parole and Correctional Association 77th Annual Training Conference SB81/ROCK: What Makes Evaluation So Important? September 18, 2008 Prepared by Urban Strategies Council Evaluation Team Steve Spiker, Director of Research Bill Heiser, Research and Program Associate Junious Williams, CEO www.urbanstrategies.org

  2. Outline • Presenter Biographies • The Council • What is Program Evaluation • Why Evaluate? • The ROCK Program • Evaluation Outcomes

  3. Urban Strategies Council • Non-profit founded in 1987 in Oakland, California • Mission is the elimination of persistent poverty by building vibrant, healthy communities • Operating Programs: • Economic Opportunity • Education Excellence • Community Safety & Justice • Support Programs: • Research & Technology • Community Capacity Building

  4. Community Safety & Justice Current Projects • Alameda County Reentry Network (www.acreentry.org) • Reentry Health Gap Survey • Oakland Community Policing • Oakland Crime & Homicide Analysis • Alameda County Violence Prevention Initiative • Community Service Gateway Reentry Project Past Projects • Reentry Health Task Force • Richmond Violent Crime Analysis

  5. What is Program Evaluation • Formalized approach to studying processes and impacts • Includes both quantitative and qualitative methods • In recent years, evaluation has focused more on utility, relevance and practicality, and less on scientific validity • Focus points: • What do you need to know? • How can it be applied?

  6. What is Program Evaluation • 5 dimensions: • Needs Assessment (Why do an evaluation, what must we know?) • Program Theory (How does the program work?) • Process Analysis (Formative Evaluation) • Impact Analysis (Goals met?) • Cost-Benefit & Cost-Effectiveness analysis (Is what we did scalable beyond a pilot?)

  7. Why Evaluate? • Program improvement • Comparison between programs • Demonstrate measurable impact of new programs • Decision making: are there particular groups who will benefit most? • Independent verification of program suitability and success

  8. Evaluation Considerations • Why are we thinking of an evaluation? • Who is the evaluation for? (Audience) • What type of information will we need? (Data) • Goals, processes or outcomes? • Evaluators can help improve program design • Evaluators can assist in building skills, knowledge and abilities of staff • Include them as part of your project team, not as outsiders

  9. Evaluation Issues • Are you asking the right questions? • There may be multiple “right” answers • Is there a broader context you need to consider? • Need to include stumbling blocks and failures! • Consider assumptions in model/theory • How well can we generalize our results?

  10. When to hire an evaluator? • Ideally at the proposal-writing stage! • Preferably before you start providing services • Never wait until the first annual reports are due!

  11. SB81/ROCK Evaluation • Formative Evaluation • Data Collection & Instrument Design • Data Analysis • Causation & Recommendations

  12. Population Characteristics • Total probationers 18-24 in Oakland/Hayward: 264 • 228 Male (86%), 36 Female (14%)
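As a quick sanity check on the slide's percentages, a minimal sketch using only the counts given above (228 of 264 is 86%, and 36 of 264 is 14%):

```python
# Gender breakdown of the 264 probationers aged 18-24 in Oakland/Hayward,
# using the counts reported on the slide.
total = 264
male, female = 228, 36
assert male + female == total  # counts sum to the reported total

male_pct = round(100 * male / total)      # 86
female_pct = round(100 * female / total)  # 14
print(f"Male: {male} ({male_pct}%), Female: {female} ({female_pct}%)")
```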

  13. SB81/ROCK Evaluation • Performance goals to be measured: • Enrollment in recommended programs and services; • Participation in the required or recommended services; • Acquisition of the intended knowledge or skills; • Completion of the program; and • Achieving the intended purpose of the program such as GED, employment, cognitive behavioral change, effective anger management, reduced and more effectively managed conflict, etc.

  14. Formative Evaluation • 6 months in length • Refining program definitions, hypotheses and procedures • Finalize specific outcome and results measures • Clarify and validate selection procedures • Baseline data analysis

  15. Process Documentation • In order to evaluate a new program we must know (i.e. document) the following: • Procedures and processes: what happens in certain situations • Referrals: how, when and why they are made • Theory of change: why each part will have an impact • Contextual and environmental conditions affecting the program implementation and outcomes

  16. Data Collection • Full probation population demographics • LS/CMI Assessment scores • Referrals: when, by whom, why and how well they worked • New arrests and convictions • Probation violations, revocations • Program removal: moving out of county, death • Surveys from pre/post cognitive behavioral classes • Interviews with DPOs on program implementation and processes

  17. Evaluation Outcomes • Causation: are the outcomes for the target group statistically different from those in the comparison group? • Can we definitively say that this model of services reduces recidivism to state prison? • If yes, then how well does this model suit the general probation population in Alameda County (or the State?)
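One standard way to ask whether target-group outcomes are statistically different from a comparison group is a two-proportion z-test on recidivism counts. The sketch below is illustrative only: the group sizes and recidivism counts are hypothetical, not ROCK data, and a real analysis would also need to account for how the comparison group was selected.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions,
    e.g. recidivism rates in a target group vs. a comparison group."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 30 of 120 target-group probationers recidivate,
# vs. 50 of 120 in the comparison group.
z, p = two_proportion_z(30, 120, 50, 120)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below the chosen significance level (conventionally 0.05) would indicate the difference in recidivism rates is unlikely to be due to chance alone, which is the kind of evidence the causation question above calls for.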

  18. Follow Up • For more information on: • The ROCK evaluation • Other evaluation services • Mapping services or spatial analysis • Analytical services • Program design • Please contact us: Steve Spiker, Research Director, steves@urbanstrategies.org or 510-893-2404
