
Evaluation Plans & Performance Indicators


Presentation Transcript


  1. Evaluation Plans & Performance Indicators Office of Research, Evaluation, and Policy Studies Marcella M. Reca Zipp November 30, 2010

  2. Necessity of evaluation plan • Types of evaluation plans • Components of an evaluation plan • Performance indicators • Reporting requirements • Sample evaluation plans

  3. Purpose of Evaluation Plan • Cohesive approach to conducting evaluation and using results • Explains what, when, how, why, who • Documents the evaluation process • Ensures implementation fidelity • Promotes a participatory approach Source: University of Toronto

  4. [Graphic slide; text not captured] Source: University of Toronto

  5. 3 Levels of Evaluation • Project-Level Evaluation • Context • Implementation • Outcome • Cluster Evaluation • Program and Policymaking Evaluation

  6. Project-Level Evaluation • Context • Needs, assets, and resources of the community • Political atmosphere; social and environmental strengths/weaknesses • Implementation • Critical components/activities of the project • Aspects that are strengths and weaknesses • How the components connect to goals and outcomes • Outcome • Critical outcomes you are trying to achieve • Impact on clients, the community, etc. • Unexpected impacts

  7. Cluster Evaluation • Determines how well the collection of projects fulfills the objective of systemic change. • Not a substitute for project-level evaluation. • Looks across a group of projects to identify common themes. • Information is reported in aggregate form to the granting organization.

  8. Program and Policymaking Evaluation • Macro form of evaluation. • Uses information gathered from both project-level and cluster evaluations to make effective decisions about program funding and support. • Supports communities in creating policy change at the local, state, and federal levels.

  9. Elements of an Evaluation • Introduction • Project Objectives • Logic Model • Partnership Roles and Responsibilities • Intervention Programming/Research • Methodology/Data Collection • Instrumentation Measures

  10. Introduction • Provides background information for the evaluation, identifies its purpose and goals, and sets the course for the evaluation road map. • Evaluation purpose and goals • What does the evaluation strive to achieve? • Evaluation team • Who is the evaluation coordinator? • Who are the members of the evaluation team?

  11. Logic Model • Graphic depiction of the program description. • Links needs, objectives, activities, measurements. • Provides scope of program. • Ensures systematic decisions are made about what will be measured. • Identifies and organizes indicators.
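
The logic model slide above lends itself to a small data-structure sketch. The following Python fragment is purely illustrative (the need, objective, activities, and indicators are invented) and shows how the links from needs to measurements can be kept explicit:

    from dataclasses import dataclass

    @dataclass
    class LogicModelRow:
        # One row of a logic model: a need, the objective that addresses it,
        # the activities that carry out the objective, and the indicators
        # that will measure progress. All example values are hypothetical.
        need: str
        objective: str
        activities: list
        indicators: list

    row = LogicModelRow(
        need="Low health literacy in the target community",
        objective="Increase participants' health knowledge by 20%",
        activities=["Monthly workshops", "Take-home materials"],
        indicators=["Pre/post knowledge test scores", "Workshop attendance"],
    )

    # The chain from need to measurement stays explicit and auditable.
    print(row.need, "->", row.objective, "->", row.indicators)

Keeping the model in a structured form like this makes it easy to check that every objective has at least one activity and one indicator attached to it.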

  12. Objectives • PIMO method • The number of objectives is determined by the project's purpose (e.g., intervention, treatment, prevention). • Interrelated with your planned activities (e.g., education, service, research). • Feasible to measure and able to yield accurate results.

  13. Partnership Roles and Responsibilities • Project partners are expected to contribute specific, unique expertise to your project activities, either in a direct-service function or as advisory units. • Identify each partner's role and responsibilities in terms of involvement in your project. • Activities assigned to partners must be evaluated both formatively and summatively.

  14. Intervention Programming • Identify one or more intervention strategies used to support project activities and anticipated outcomes. • Cite whether the program appears on the federal evidence-based initiative (EBI) list. • For market-available programs that require training and certification of direct-service providers, provide a timetable for completing that training before the intervention can be used.

  15. Performance Indicators • Visible, measurable signs of program performance. • Relevant, understandable, and useful. • Reflect program objectives, the logic model, and evaluation questions. • Define success through reasonable expectations of program performance. Source: University of Toronto

  16. Performance Indicators cont. • Other terms – industry jargon • Key Performance Indicator (KPI) • Performance metric • Performance standard • Balanced Scorecard • Quality indicators • All are different terms for the same thing: measuring performance.
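
To make "define success" concrete, an indicator can pair a measured value with a baseline and a target. A minimal Python sketch with entirely hypothetical numbers:

    from dataclasses import dataclass

    @dataclass
    class Indicator:
        # A performance indicator: what is measured, where it started,
        # and what counts as success. All numbers below are invented.
        name: str
        baseline: float
        target: float
        current: float

        def met(self) -> bool:
            # Success here means meeting or exceeding the target;
            # a real indicator may define success differently.
            return self.current >= self.target

    attendance = Indicator(name="Workshop attendance rate",
                           baseline=0.40, target=0.70, current=0.65)
    print(attendance.name, "target met:", attendance.met())  # target met: False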

  17. Data Collection • What methods will be used? • How often will data be collected? • Who will collect the data? • Validity and reliability of data sources • Baseline data • Outcomes-based triangulation • Quality assurance • Design (experimental, quasi-experimental, etc.)
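
The what/how-often/who questions above can be pinned down in a simple collection schedule. A purely illustrative sketch, in which the methods, intervals, and roles are all assumptions:

    # Hypothetical data-collection plan: each entry records the method,
    # how often it runs, and who is responsible for it.
    collection_plan = [
        {"method": "Participant survey", "frequency": "quarterly",
         "collector": "Evaluation coordinator"},
        {"method": "Attendance logs", "frequency": "each session",
         "collector": "Program staff"},
        {"method": "Focus groups", "frequency": "biannually",
         "collector": "External evaluator"},
    ]

    for item in collection_plan:
        print(f"{item['method']}: {item['frequency']} ({item['collector']})")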

  18. Instrumentation Measures • Tools for data collection • Only collect the information you need • Easy to administer and use • Pilot test tools before use in the evaluation • Human Subjects Considerations • IRB, school board approval • Data management and storage • Confidentiality and data quality
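
On the confidentiality point, one common safeguard is to replace participant identifiers with one-way hashes before storage. A minimal sketch using Python's standard hashlib; the salt and identifier are placeholders, and any real scheme should follow the project's IRB-approved data-management protocol:

    import hashlib

    SALT = "project-specific-secret"  # placeholder; store separately from the data

    def pseudonymize(participant_id: str) -> str:
        # One-way hash so stored records cannot be traced back to a person
        # without the salt. Illustration only, not a vetted protocol.
        return hashlib.sha256((SALT + participant_id).encode()).hexdigest()[:12]

    print(pseudonymize("jane.doe@example.org"))  # a short, stable pseudonym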

  19. Tips & Helpful Hints • Be realistic • In your assessment of resources • In your timeline • Seek help • Use templates, tables, or guides that may be provided in the RFP, or model your plan on past funded proposals.

  20. Reporting and Dissemination • Dissemination • How will you disseminate findings? • Who is responsible? • How, where, when will findings be used? • Reporting • Formative reports – quarterly, biannually • Summative reports – final report/end of project • Project deliverables
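
The quarterly formative cadence can be made explicit as a schedule. An illustrative Python sketch; the start date and the 91-day approximation are assumptions, since a funded project would use the due dates fixed in its award terms:

    from datetime import date, timedelta

    def quarterly_due_dates(start: date, quarters: int) -> list:
        # Approximate quarterly reporting dates (every 91 days).
        return [start + timedelta(days=91 * i) for i in range(1, quarters + 1)]

    for due in quarterly_due_dates(date(2010, 12, 1), quarters=4):
        print("Formative report due:", due.isoformat())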

  21. Sample RFP Evaluation Plan • Two examples of an evaluation plan within an RFP • General, limited specifications • Complex, very detailed

  22. Evaluation Resources • CDC: www.cdc.gov/eval • University of Toronto: www.utoronto.ca/shp/hcu • W.K. Kellogg Foundation: www.wkkf.org/Publications/evalhdbk • Connell, J. P., Kubisch, A. C., Schorr, L. B., & Weiss, C. H. (1995). New Approaches to Evaluating Community Initiatives. New York, NY: Aspen Institute. • Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of Program Evaluation. Newbury Park, CA: Sage Publications. • Taylor-Powell, E., Steele, S., & Douglas, M. (1996). Planning a Program Evaluation. Madison, WI: University of Wisconsin Cooperative Extension.
