
Earmark Grant Evaluation: An Introduction and Overview


Presentation Transcript


  1. Earmark Grant Evaluation: An Introduction and Overview
  Presented by: Jeff Padden, President
  Public Policy Associates, Inc.
  119 Pere Marquette Drive
  Lansing, Michigan 48912-1231
  (517) 485-4477
  www.publicpolicy.com
  May 19, 2005

  2. Presentation Topics
  • The evaluation requirement for earmark grants
  • Evaluation overview – or – “Where’s the upside?”
  • Planning the evaluation
  • The evaluation process for earmark grants
  • Discussion
  We’ll do clarifications on the fly, broader discussion at the end.

  3. The Evaluation Requirement

  4. Each grantee must …
  • Conduct or commission an evaluation
  • Submit an evaluation plan
  • Use the evaluation template
  • Submit an evaluation report shortly after completion of project activities

  5. Evaluation Overview – or – “Where’s the upside?”

  6. Program evaluation is ...
  • The systematic collection of information about the subject of the evaluation
  • Used to make decisions about an organization’s or program’s:
    • Creation
    • Improvement
    • Effectiveness

  7. Evaluation is a mindset …
  • We are all evaluators
  • Evaluation is continuous
  • Evaluation looks forward, not just backward
  • Involves organizational learning
  • Means people working together

  8. Evaluation allows you to examine ...
  • What’s working well
  • What is not
  • How to improve
  There is no bad news, only news!

  9. Evaluation requires comparison ...
  • Of the same group over time
    • Pre- and post-tests
    • Trends in community-level data
  • Of two comparable groups
    • At one point in time
    • Over time
  • Of your group to a larger group
    • County compared to state
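To make these comparisons concrete, here is a minimal Python sketch using invented wage and placement numbers (none of these figures come from the presentation). It computes a pre/post change for one group over time and a county-versus-state rate comparison.

```python
from statistics import mean

# Hypothetical hourly wages for the same trainees before and after training
pre_wages = [9.50, 10.00, 8.75, 11.25, 9.00]
post_wages = [11.00, 12.50, 10.25, 13.00, 10.75]

# Comparison of the same group over time (pre- vs. post-test)
change = mean(post_wages) - mean(pre_wages)
print(f"Average wage gain per trainee: ${change:.2f}/hour")

# Comparison of your group to a larger group (county vs. state placement rate)
county_placed, county_trainees = 42, 60
state_placed, state_trainees = 5_100, 8_500
print(f"County placement rate: {county_placed / county_trainees:.1%}")
print(f"State placement rate:  {state_placed / state_trainees:.1%}")
```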

  10. Our Approach: Utilization-Focused Evaluation
  • Focuses on intended uses and users
  • Is inherently participatory and collaborative by actively involving primary intended users in all aspects of the evaluation
  • Leads to ongoing, longer-term commitment to using evaluation logic and building a culture of learning in a program or organization
  • Symbiotic rather than parasitic

  11. Benefits of Evaluation
  • Program/organizational improvement
  • Accountability to funders and others
  • Planning
  • Program description for stakeholders
  • Public relations
  • Fund raising
  • Policy decision making
  Evaluation has lots of upside!

  12. Planning the Evaluation

  13. Elements of the Evaluation Plan
  • Who conducts the evaluation?
    • Internal or external?
    • Experienced or novice?
  • When do they do it?
    • Along the way or after the fact?
  • How much do they do?
    • The level of intensity must fit the project
    • Too much diverts resources; too little leaves unanswered questions
  • What exactly do they do?
    • Six major steps

  14. Evaluation Steps
  1. Clarify project & goals
  2. Establish measures
  3. Collect data
  4. Analyze data
  5. Prepare reports
  6. Improve project

  15. Step 1: Clarify Project & Goals
  Thinking about goals:
  • What are you trying to accomplish?
  • What would success look like?
  • What is the difference between the current state of affairs and what you are trying to create?
  Example of a goal statement: “Increase incomes of low-income families in the region through training for entry-level jobs that have career ladders leading to good jobs.”

  16. Does the Project Hang Together?
  • Are the expected outcomes realistic?
  • Are there enough resources?
  • Do the customers like the product?
  • Does the organization have the right skills?
  Logic models help answer these questions.

  17. A Simple Logic Model
  • Inputs: things needed to run the project (people, resources, money, etc.)
  • Activities: what you do (market, recruit, design, train, place, etc.)
  • Outputs: direct results of activities (training completers, credentials awarded, etc.)
  • Outcomes: changes caused by the project (jobs, wages, promotions, etc.)
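For grantees who track their evaluation in code, the four logic-model columns can also be captured in a small data structure; the sketch below is a hypothetical illustration for a training project, not a DOL format.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal inputs -> activities -> outputs -> outcomes chain."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

# Hypothetical model for an earmark-funded training project
model = LogicModel(
    inputs=["grant funds", "trainers", "classroom space"],
    activities=["recruit participants", "deliver training", "place graduates"],
    outputs=["training completers", "credentials awarded"],
    outcomes=["jobs obtained", "wage gains", "promotions"],
)

for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage:>10}: {', '.join(getattr(model, stage))}")
```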

  18. Step 2: Establish Measures
  • Determine performance measures
    • Must be quantifiable
    • Data must be available, reliable, and valid
  • Examples of measures:
    • Activity: number of training sessions
    • Output: number of trainees
    • Outcome: skill and credential gains
    • Impact: stronger local workforce
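One hypothetical way to keep each measure quantifiable and tied to an available data source is a simple measure registry; the entries below are invented examples in the spirit of the slide.

```python
# Hypothetical registry of performance measures for the evaluation plan
measures = [
    {"level": "Activity", "measure": "Number of training sessions",        "source": "attendance logs"},
    {"level": "Output",   "measure": "Number of trainees completing",      "source": "administrative records"},
    {"level": "Outcome",  "measure": "Credential attainment rate",         "source": "credential records"},
    {"level": "Impact",   "measure": "Regional employment in target jobs", "source": "state labor data"},
]

for m in measures:
    print(f"{m['level']:<9} {m['measure']:<38} data source: {m['source']}")
```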

  19. Step 3: Collect Data
  • Identify data sources, such as:
    • Administrative records
    • Surveys, interviews, focus groups
    • Observation
  • Gather data
    • Design the instruments and procedures for collection
    • Conduct data collection periodically
  • Record data
    • Organize data
    • Create database
    • Verify data
  Remember the measures!
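As one illustration of the record, organize, and verify steps, a small project might load its records into a lightweight database and flag incomplete entries; the table layout and records below are invented for this sketch, using Python's built-in sqlite3 module.

```python
import sqlite3

# Hypothetical trainee records gathered from surveys and administrative files
records = [
    {"id": 1, "completed": 1, "credential": "CNA", "pre_wage": 9.50,  "post_wage": 11.00},
    {"id": 2, "completed": 1, "credential": None,  "pre_wage": 10.00, "post_wage": 12.50},
    {"id": 3, "completed": 0, "credential": None,  "pre_wage": 8.75,  "post_wage": None},
]

# Organize the data in a small database so it can be verified and re-queried
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trainees (id INTEGER PRIMARY KEY, completed INTEGER, "
    "credential TEXT, pre_wage REAL, post_wage REAL)"
)
conn.executemany(
    "INSERT INTO trainees VALUES (:id, :completed, :credential, :pre_wage, :post_wage)",
    records,
)

# Verify the data: flag completers with no post-training wage on record
missing = conn.execute(
    "SELECT id FROM trainees WHERE completed = 1 AND post_wage IS NULL"
).fetchall()
print("Records needing follow-up:", [row[0] for row in missing])
```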

  20. Step 4: Analyze and Interpret Data
  • Sort and sift: organize data for interpretation
    • Cross-tabs
    • Modeling
  • Conduct data analysis to look for:
    • Changes over time
    • Progress relative to goals or standards
    • Differences between groups
  • Test preliminary interpretation
  This is the most creative step.
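As a small example of the sort-and-sift idea, a cross-tab of completion status against employment outcome shows differences between groups at a glance, and a group rate can be checked against a goal. The sketch below assumes pandas is available and uses made-up data.

```python
import pandas as pd

# Hypothetical participant-level data assembled during Step 3
df = pd.DataFrame({
    "completed": ["yes", "yes", "no", "yes", "no", "yes"],
    "employed":  ["yes", "yes", "no", "no",  "no", "yes"],
    "site":      ["A",   "B",   "A",  "B",   "A",  "A"],
})

# Cross-tab: completion status vs. employment outcome (differences between groups)
print(pd.crosstab(df["completed"], df["employed"], margins=True))

# Progress relative to a goal: share employed among completers vs. a 70% target
rate = (df.loc[df["completed"] == "yes", "employed"] == "yes").mean()
print(f"Employment rate among completers: {rate:.0%} (goal: 70%)")
```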

  21. Step 5: Prepare Reports
  • Determine reporting schedule
  • Report preliminary findings to key stakeholders and other audiences
  • Gather reactions
  • Incorporate reactions
  • Finalize reporting products
  Different audiences need different types of reports.

  22. Step 6: Improve Project
  • Deliver reporting products internally
  • Facilitate strategic and operational planning
  • Improve processes and results
  A good evaluation will be more valuable to you than to DOL!

  23. The Evaluation Process for Earmark Grants

  24. Use the DOL Tools
  • “The Essential Guide for Writing an Earmark Grant Proposal”
  • “Evaluation Template for Earmark Grantees” (to be provided later)

  25. Discussion

  26. Thanks to … … for the use of the “Demystifying Evaluation” materials.
  Useful evaluation links:
  • W.K. Kellogg Foundation: www.wkkf.org/Programming/Overview.aspx?CID=281
  • American Evaluation Association: www.eval.org/EvaluationLinks/default.htm
  • Western Michigan University Evaluation Checklists: www.wmich.edu/evalctr/checklists/checklistmenu.htm

  27. Earmark Grant Evaluation: An Introduction and Overview
  Presented by: Jeff Padden, President
  Public Policy Associates, Inc.
  119 Pere Marquette Drive
  Lansing, Michigan 48912-1231
  (517) 485-4477
  www.publicpolicy.com
  May 2005
