Evaluation: Options and Pitfalls

Cache Steinberg, Ph.D., LCSW

Evaluation answers questions

  • Did we provide the service the way we said we would?

  • Did people change?

  • Did the intervention influence/cause the change?

  • Did the change last? (A pre/post/follow-up analysis sketch follows this list.)
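
The last three questions usually come down to comparing scores over time. Below is a minimal sketch, assuming pre, post, and follow-up scores have already been collected from the same participants; the data, variable names, and the choice of a paired t-test are illustrative assumptions, not part of the original slides.

    import numpy as np
    from scipy import stats

    # Illustrative scores for the same participants at three time points
    # (placeholder values, not program data).
    pre       = np.array([18, 22, 25, 30, 21, 27, 19, 24])
    post      = np.array([24, 25, 31, 34, 26, 30, 22, 29])
    follow_up = np.array([23, 24, 30, 33, 25, 29, 21, 28])

    # "Did people change?" -- compare pre vs. post scores for the same people.
    t_change, p_change = stats.ttest_rel(post, pre)

    # "Did the change last?" -- compare post vs. follow-up scores.
    t_lasting, p_lasting = stats.ttest_rel(follow_up, post)

    print(f"Pre-to-post change: t = {t_change:.2f}, p = {p_change:.3f}")
    print(f"Post-to-follow-up:  t = {t_lasting:.2f}, p = {p_lasting:.3f}")

Answering "Did the intervention influence/cause the change?" takes more than this sketch: it requires a comparison or control group, or another design that rules out competing explanations.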

Useful evaluation

  • Begins with program design.

  • Includes input from program staff.

  • Answers important questions about the program.

  • Is used to improve the program!

Logic Model
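
A logic model typically links inputs, activities, outputs, and outcomes, with influencing factors noted alongside. The sketch below only illustrates that structure; the program details in it are hypothetical and are not taken from this presentation.

    # Illustrative logic model structure (all program details are hypothetical).
    logic_model = {
        "influencing_factors": ["agency capacity", "community context"],
        "inputs": ["funding", "clinical staff", "referral partners"],
        "activities": ["weekly group sessions", "case management"],
        "outputs": ["sessions delivered", "clients served"],
        "outcomes": {
            "immediate": "improved coping skills",
            "intermediate": "reduced symptom scores",
            "long_term": "sustained functioning at follow-up",
        },
    }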

Influencing factors



  • The grant writer, evaluator, and program staff share the same clear understanding of the process and expected outcomes.

  • Objectives can be stated easily.

Types of evaluation

  • Process

  • Outcome

  • Efficiency

  • Comprehensive

Elements of an outcome evaluation plan

  • Describe the design.

  • Identify what will be measured.

  • Describe the data collection plan: type of data, source, collection procedures, and timetable.

  • Describe the sampling plan.

  • Describe the analysis techniques.

  • Describe how human subjects will be protected. (A sketch of such a plan follows this list.)
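
As a hedged illustration only, the elements above can be captured in a short structured outline; every value below is a placeholder assumption rather than a requirement.

    # Hypothetical outcome evaluation plan outline (all values are placeholders).
    evaluation_plan = {
        "design": "single-group pre/post with 6-month follow-up",
        "measures": ["standardized symptom scale", "school attendance records"],
        "data_collection": {
            "type_of_data": "self-report surveys and administrative records",
            "source": "participants and partner schools",
            "procedures": "administered by program staff at intake and exit",
            "timetable": "intake, exit, and 6 months after exit",
        },
        "sampling": "all enrolled participants (census, no sampling)",
        "analysis": "paired pre/post comparisons and descriptive statistics",
        "human_subjects": "informed consent, de-identified data, IRB review",
    }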

Pitfall #1: Too many objectives

  • Pick your objectives thoughtfully.

    • Key processes

    • Key immediate and intermediate outcomes

Pitfall #2: Vague/slapdash objectives

  • Wording dictates the design of the outcome evaluation.

  • Make each objective measurable.

  • State one outcome or process per objective.

Pitfall #3: Unreliable measurement

  • Don’t make up your own instruments unless absolutely necessary.

  • Pick instruments with high reliability and validity (a reliability check is sketched after this list).

  • Pick instruments that have a strong link to what your program proposes to change.
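
Reliability is usually reported as a coefficient such as Cronbach's alpha for internal consistency. Below is a minimal sketch of that check on piloted item responses; the data and the 0.80 rule of thumb are illustrative assumptions.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Internal-consistency reliability: rows = respondents, columns = items."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical pilot responses to a 4-item instrument scored 1-5.
    responses = np.array([
        [4, 4, 5, 4],
        [2, 3, 2, 2],
        [5, 4, 5, 5],
        [3, 3, 3, 2],
        [4, 5, 4, 4],
        [1, 2, 1, 2],
    ])

    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")  # >= 0.80 is a common rule of thumb

For an established instrument, published reliability and validity evidence usually carries more weight than a small in-house pilot like this.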

Pitfall #4: Unrealistic benchmarks

If you must set numerical benchmarks, base them on facts (a brief sketch follows this list):

  • Historical data from your agency.

  • Reports in the literature.
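
A minimal sketch of grounding a benchmark in historical agency data follows; the rates and the size of the improvement target are made-up assumptions.

    # Hypothetical program-completion rates from the agency's last four years.
    historical_rates = [0.58, 0.62, 0.55, 0.60]

    baseline = sum(historical_rates) / len(historical_rates)

    # A modest improvement target over the historical baseline is easier to
    # defend than an aspirational round number.
    proposed_benchmark = baseline + 0.05

    print(f"Historical baseline: {baseline:.0%}")
    print(f"Proposed benchmark:  {proposed_benchmark:.0%}")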

Pitfall #5: Evaluation burden


  • Use information you already gather, such as school grades, arrest records, and Global Assessment of Functioning (GAF) scores.

  • Integrate measurement into the therapeutic process.

Contact Information

Cache Steinberg, Ph.D., LCSW

Office of Community Projects

Graduate School of Social Work, U. of H.


[email protected]
