
Evaluations and their Role in Grant Writing



Presentation Transcript


  1. Evaluations and their Role in Grant Writing Presented by: Nina Gottlieb, Ph.D. Stephanie Wexler-Robock, Ph.D.

  2. Some grant-making realities • Fewer resources • Greater competition • Greater focus on program outcomes and effectiveness

  3. What is an Evaluation? • Program evaluation is simply a systematic method for collecting, analyzing, and using information to answer questions about your program. • A good evaluation tells you about your program’s outcomes.

  4. Outcomes are… • a program’s impact or value. Outcomes go beyond the number of people served or participating in a program or initiative to look at the changes that have occurred as a result. • Outcomes can be… • Improved attitudes • Increased knowledge • Changes in behavior • Financial gain • Improved condition • Preservation of environmental resources

  5. Why evaluate your program? • Program evaluation is a valuable tool for organizations seeking to identify and measure outcomes and effectiveness, strengthen the quality of their programs, and improve outcomes for those they serve. • A well-done evaluation can find out “what works”, “what doesn’t work”, and why. It provides continuous feedback to inform program improvement. • Program evaluation can highlight a program’s effectiveness to the community, funders, and potential funders.

  6. Benefits of Evaluation • A program evaluation can provide critical information to improve program services and activities, inform future training, and align program goals with realistic implementation strategies. • Evaluation gives program developers and staff valuable information on which to base decision-making about program services. • Evaluations provide accountability.

  7. What you need to start • You can think about a program in terms of inputs, process, outputs, and outcomes. • Inputs are the resources needed to run the program ($, facilities, program staff, clients, etc.) • The process is how the program is carried out (e.g., how children are cared for, clients are served, art is created, etc.) • Outputs are the units of service, such as the number of customers/clients served, things produced, etc. • Outcomes are the impacts on customers/clients receiving services (e.g., better academic outcomes, richer artistic appreciation, improved quality of life, etc.)
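To make the inputs, process, outputs, and outcomes framing concrete, here is a minimal Python sketch that records the four components for a hypothetical after-school tutoring program. The class name, fields, and every example value are illustrative assumptions, not part of the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program described by the four components named on this slide."""
    inputs: list[str] = field(default_factory=list)    # resources needed ($, facilities, staff)
    process: list[str] = field(default_factory=list)   # how the program is carried out
    outputs: list[str] = field(default_factory=list)   # units of service delivered
    outcomes: list[str] = field(default_factory=list)  # impacts on those served

# Hypothetical example; every value below is invented for illustration.
after_school_tutoring = LogicModel(
    inputs=["$40,000 grant", "classroom space", "2 part-time tutors"],
    process=["small-group tutoring, 3 sessions per week"],
    outputs=["120 students served", "4,300 tutoring hours delivered"],
    outcomes=["improved reading scores", "increased school engagement"],
)
print(after_school_tutoring.outcomes)
```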

  8. Types of Evaluations • Process or formative evaluations examine the extent to which a program is operating as intended and/or the quality of the implementation. • Outcome or summative evaluations investigate whether changes occur for program participants and the extent to which these can be attributed to the program activities. “When the cook tastes the soup, that’s formative; when the guests taste the soup, that’s summative.” Bob Stake, quoted in Scriven, 1991, p. 169.

  9. Evaluation Methods • Quantitative: Data in the form of numbers and statistics • Numerical information (number of people, frequency of services) • Surveys • Analysis of test scores or other measures of behavior • Demographics (data on program participants or contexts) • Qualitative: Data reflect how people experience a program or activity • Focus groups • Interviews • Observations • Mixed Methods • Combination of quantitative and qualitative methods
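As a rough illustration of mixed methods in practice, the sketch below summarizes hypothetical quantitative data (monthly client counts) alongside a simple tally of coded themes from qualitative interview notes. All numbers and theme labels are invented for illustration.

```python
from collections import Counter
from statistics import mean

# Quantitative: hypothetical numerical information about services delivered.
clients_served_per_month = [42, 38, 51, 47, 44, 53]

# Qualitative: hypothetical theme codes assigned to interview responses.
interview_themes = [
    "staff supportive", "scheduling barrier", "staff supportive",
    "transportation barrier", "staff supportive", "scheduling barrier",
]

print(f"Mean clients served per month: {mean(clients_served_per_month):.1f}")
print(f"Most common interview themes: {Counter(interview_themes).most_common(2)}")
```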

  10. When is the “best” time to conduct a formative or a summative evaluation? • Programs can begin thinking about evaluation in any phase of a project. • Programs just starting out can benefit from considering and crystallizing their goals and objectives, identifying outcomes, and ensuring that they are aligned with program services and activities. • Pilot programs can use evaluation as an opportunity to get information on program activities: which activities have been implemented, how, under what circumstances, etc. This information is extremely valuable to collect prior to expanding the pilot. • Established programs can use formative and summative techniques together to “check in” on how their programs are being implemented and to link the various ways the program is being implemented with project outcomes.

  11. Can you do a summative evaluation without a formative evaluation? • Case Study: New York Road Runners Foundation Mighty Milers Program Evaluation • While programs may want to jump right into finding out how effective they are, this evaluation strategy can easily backfire.

  12. The Value of Formative Evaluations • Regardless of how simple or clear-cut you may think a program is, its implementation in the real world is likely to be somewhat different from its original vision. • Looking more closely at program implementation can give you valuable information about how the program is actually working, whether the intended target population is being reached, and the major challenges and successful implementation strategies.

  13. The Value of Formative Evaluations: Case Study of the New York Road Runners Foundation Mighty Milers Program… or “But We Don’t Have a Gym!” • The NYRRF described their Mighty Milers program as a simple, straightforward program for elementary school students designed to instill the daily habit of running and/or walking for physical fitness and overall health. Their original intention was to conduct a summative evaluation only. • We found, however, that the program was anything but simple in the real world. • There were substantial differences in where students ran (outside, in a school gym, around a classroom, in the lunchroom, on the roof, at a local park), how long they ran (anywhere from 5 minutes to 45 minutes), how their mileage was tracked (by students themselves, by teachers, estimated as a group, etc.), and a host of other issues.

  14. The Value of Formative Evaluations: Case Study of the New York Road Runners Foundation Mighty Milers Program • Using formative evaluation techniques, we uncovered a range of implementation models of this “simple” program. NYRRF realized how many different variations of the program existed in the schools they served compared with their original vision. • The formative data we provided prompted a self-evaluation by the foundation and its staff, allowing them to revisit their original goals and objectives based on a more accurate and realistic picture of the strengths, challenges, and context of their program. • NYRRF used this information to help revamp their training materials to better prepare people to implement the Mighty Milers program under the many different circumstances that existed in the schools with which they worked. • The organization also reorganized its staff to provide more “hands-on” assistance to schools to successfully implement the program.

  15. The Value of Formative Evaluations: Linking Process with Outcomes for a Dynamic Evaluation • Examining the details of program implementation is also critical to linking outcomes with implementation strategies and models. • Without evaluating how a program is actually working in the “real world,” it is difficult to identify the most effective strategies for your program or to fine-tune the program to have the greatest impact.

  16. Identifying Program Outcomes • One of the most critical steps in conducting a program evaluation is to identify the particular outcomes the program hopes to achieve. • These outcomes are specific to each program and should be a natural extension of the program’s goals, objectives, and activities or services.

  17. Steps in Identifying Program Outcomes and Evaluation Questions • #1: Reflect on the program’s mission and purpose and what impacts you want to have on the population you serve. In essence, why are you doing what you’re doing? And what do you want to happen as a result? Typically, outcomes and evaluation questions are related to changes in attitudes, knowledge, behavior, or skills. • #2: Choose the outcomes you want to examine. You may not have the resources to examine all of the outcomes at one time, so you may want to prioritize them.

  18. Steps in Identifying Program Outcomes: Indicators • #3: For each outcome, identify what measurable indicators will tell you that you are achieving that outcome. What observable or measurable behaviors, perceptions, attitudes, or knowledge will your target group achieve as a result of your program’s activities? What difference would you see in this group compared with a group that did not participate in your program? There might be several indicators for each outcome. • #4: Work backwards to make sure that for each indicator, your program is actually providing services that make it likely that the indicator will show improvement or change.
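Here is a minimal Python sketch of steps #3 and #4: each outcome is mapped to measurable indicators, and a "work backwards" check flags any indicator that no program activity actually supports. The outcomes, indicators, and activities are hypothetical examples loosely inspired by a running program, not drawn from the NYRRF evaluation.

```python
# Step #3: hypothetical outcomes, each with one or more measurable indicators.
indicators_by_outcome = {
    "improved attitudes toward physical activity": [
        "pre/post attitude survey score",
        "share of students reporting they enjoy running",
    ],
    "increased physical activity": [
        "weekly miles logged per student",
        "days per week active outside school",
    ],
}

# Step #4: which program activities support each indicator (also hypothetical).
supporting_activities = {
    "pre/post attitude survey score": ["daily running/walking sessions"],
    "share of students reporting they enjoy running": ["daily running/walking sessions"],
    "weekly miles logged per student": ["daily running/walking sessions", "mileage tracking"],
    "days per week active outside school": [],  # gap: nothing in the program targets this
}

# Work backwards: flag indicators with no supporting activity.
for outcome, indicators in indicators_by_outcome.items():
    for indicator in indicators:
        if not supporting_activities.get(indicator):
            print(f"Gap: '{indicator}' (outcome: {outcome}) has no supporting activity")
```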

  19. Identifying Outcomes: NYRRF • Examples of some outcomes generated from research questions for the evaluation of the NYRRF Mighty Milers program included: impacts on students’ attitudes about physical activity and fitness; changes in participants’ behaviors regarding health and physical activity since starting the program; and perceived benefits of running/walking.

  20. The Value of Summative Evaluations • Helps you determine whether a fully developed program is meeting its objectives. • Summative evaluations examine whether, to what extent, and in what direction outcomes change for those in the program.
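The sketch below shows the kind of pre/post comparison a summative evaluation might start from: whether scores changed, by how much, and in what direction. The scores are hypothetical; a real evaluation would add a significance test (e.g., a paired t-test) and, ideally, a comparison group.

```python
from statistics import mean, stdev

# Hypothetical pre- and post-program scores for the same 8 participants.
pre  = [52, 61, 48, 55, 67, 50, 58, 63]
post = [58, 64, 55, 54, 72, 57, 60, 70]

changes = [after - before for before, after in zip(pre, post)]

print(f"Mean change: {mean(changes):+.1f} points (SD {stdev(changes):.1f})")
print(f"Direction: {'improvement' if mean(changes) > 0 else 'no gain'}")
print(f"Participants who improved: {sum(c > 0 for c in changes)} of {len(changes)}")
```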

  21. Internal versus External Evaluations • Both internal evaluations (those conducted by program staff) and external evaluations (those conducted by professionals outside the program) have advantages and disadvantages. • Internal evaluations can be a less expensive option, and the program staff has detailed knowledge of the program being evaluated. Disadvantages include staff lacking the evaluation experience to conduct a rigorous evaluation; a lack of objectivity and credibility, especially in the view of potential funders; inhibited honesty among other staff or “clients,” who may feel they cannot express themselves forthrightly to someone they know and will see again; and a lack of staff time to conduct an evaluation. • External evaluators can be more expensive and need to spend some time getting to know the details of a program. On the other hand, external evaluations offer greater credibility with the “outside world”; experienced evaluators bring efficiency and technical expertise to the program and the evaluation; and external evaluators have experience writing evaluation reports for diverse audiences.

  22. Reporting Your Results • Evaluation results can be reported in a variety of ways. • Interim reports provide feedback on program activities and/or data analyzed during the course of an annual evaluation. • Final reports generally are in-depth, cumulative reports that include results of all data collected during the evaluation period. • Executive summaries highlight critical findings in a brief format and can be used for wide distribution to boards of directors, potential funders, media, and other interested parties.

  23. Our Evaluation Philosophy • DRaE believes strongly in collaborative evaluations that are used as dynamic learning tools rather than static critiques. Close partnerships with clients enable us to effectively provide the ongoing information key stakeholders need about programs, services, and outcomes, highlighting best practices as well as strengthening areas that could use fine-tuning. • Working closely with clients enables us to help programs articulate what they would like to gain from an objective study of their programs and services. We work with clients to help articulate goals, operationalize program objectives, and address the needs of all stakeholders and audiences. An effective evaluation should be able to address the diverse needs of your program staff, from specific, ongoing information related to program implementation and impact to the succinct capturing and presentation of findings for potential funders.
