Valparaiso University The Lumina Center Grantseeking Workshop Series Presents Outcomes & Evaluations April 20, 2006
Today’s Agenda • Project components • Defining outcomes • Outcome-based planning • Outcomes vs. outputs • Identifying your audience • Setting outcomes to drive planning • Making outcomes measurable
Traditional Project Components • Inputs (Money, staff, volunteers, facilities, equipment, supplies) • Activities (Strategies, techniques, types of treatment) • Outputs (direct products, classes taught, educational materials distributed)
Added Project Components • Inputs • Activities • Outputs • Outcomes (including indicators and data sources)
Why Measure Outcomes? The “So What” factor As funding becomes more scarce, service providers, governments, other funders, and the public are calling for clearer evidence that the resources they expend actually produce benefits for people. Funders want greater accountability for the use of their resources.
Defining Outcomes • Outcomes are the BENEFITS realized as a result of the project. • They represent the IMPACT of the project on its participants. • They typically represent a CHANGE in the behavior, attitude, skills, status, knowledge, or life condition of the project participants.
Outcome-Based Planning Outcome-based planning uses audience needs and hoped-for results as the foundation for programs and design decisions.
Outcome-Based Evaluation Outcome-based evaluation is a systematic way to assess the extent to which a program has achieved its intended results. It focuses on two key questions: • “How has my program made a difference?” • “How are the lives of the program participants better as a result of my program?”
Benefits of Outcome-Based Planning and Evaluation • Provides a logical framework for program development • Increases program effectiveness • Communicates program value • Generates information for decision-making
Limitations of Outcome-Based Planning and Evaluation • Not identical to formal research • Suggests cause and effect but does not intend to prove it • Shows contribution, not attribution
Outcomes Checklist • Are the outcomes related to the “core business” of your program? • Is it within your control to influence the outcomes? • Are your outcomes realistic and attainable? • Are your outcomes achievable within the funding and reporting periods?
Outcomes Checklist • Are your outcomes written as change statements—will things increase, decrease, or stay the same? • Have you moved beyond client satisfaction in your outcomes? • Is there a logical sequence among your short-term, intermediate, and long-term outcomes? • Are there any big “leaps” in your outcomes (i.e. gaps in the progression of impacts)?
Develop a Program • Activities and services leading toward intended outcomes • Has a definite beginning and end • Is designed to change attitudes, behaviors, or knowledge, or to increase skills and abilities, based on assumed need
Audience Programs are developed as a result of assumptions about people’s needs. Information can be drawn from: • Your experiences • A program partner’s experiences • Formal or informal research
Audience Needs NEED: A condition, want, or deficit that is common to a group of individuals SOLUTION: A program that will change or improve behaviors, knowledge, skills, attitudes, life condition, or status DESIRED RESULTS: The change or improvement you expect to achieve
Process NEEDS → SOLUTIONS → RESULTS Getting to evaluation of outcomes: • What are the indicators? • What are the data sources?
Results • Based on audience success: that participants know something or will be able to do something • Can be measured at various times (e.g., initial, intermediate, long-term) • Targets can be established for each period
Reports Reports summarize the results of outcome data and include: • Participant characteristics • Inputs, activities and services, outputs, and outcomes • Elements requested by funders • Comparisons with previous periods • Interpretation of data
Evaluate the System After a reasonable period of operation, evaluate the effectiveness of the outcomes system: • Have audiences been sufficiently identified? • Are outcomes clearly written? • Are outcomes sufficient to describe what you hope will happen? • Are data collection methods cost efficient?
Coming Attractions September 21, 2006: “Introduction to Grantseeking on Campus” (repeat) October 19, 2006: “Federal Grantseeking” November 16, 2006: “Collecting Data for Grantseeking”