
Earmark Grant Evaluation: An Introduction and Overview



  1. Earmark Grant Evaluation: An Introduction and Overview. Presented by: Nancy Hewat, Senior Project Manager, Public Policy Associates, Inc., 119 Pere Marquette Drive, Lansing, Michigan 48912-1231, (517) 485-4477, www.publicpolicy.com. May 2005

  2. Presentation Topics • The evaluation requirement for earmark grants • Evaluation overview – or – “Where’s the upside?” • Planning the evaluation • Logic modeling • The evaluation process for earmark grants Please don’t hold your questions!

  3. The Evaluation Requirement

  4. Each grantee must … • Conduct or commission an evaluation • Submit evaluation plan • Use the evaluation template • Submit evaluation report shortly after completion of project activities

  5. Evaluation Overview– or –“Where’s the upside?”

  6. Evaluation is a mindset … • We are all evaluators • Evaluation is continuous • Evaluation looks forward, not just backward • Involves organizational learning • Means people working together

  7. Program evaluation is ... • The systematic collection of information about the subject of the evaluation • Used to make decisions about an organization’s or program’s creation, improvement, and effectiveness

  8. Evaluation allows you to examine ... • What’s working well • What is not • How to improve There is no bad news, only news!

  9. Evaluation looks in two directions: back at the past, to improve program/project quality (learning from experience), and ahead to the present and future, to show results.

  10. Evaluation requires comparison ...
  • of the same group over time (pre- and post-tests; trends in community-level data)
  • of two comparable groups (at one point in time, or over time)
  • of your group to a larger group (e.g., county compared to state)
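To make the first comparison type concrete, here is a minimal Python sketch of comparing the same group over time with pre- and post-test scores. All names and scores below are invented for illustration; they are not from any earmark grant dataset.

```python
# Hypothetical illustration of "comparison of the same group over
# time" using pre- and post-test scores. All numbers are invented.

def mean(values):
    return sum(values) / len(values)

def pre_post_gain(pre_scores, post_scores):
    """Average change for one group measured before and after."""
    return mean(post_scores) - mean(pre_scores)

# Invented test scores for ten trainees, before and after training.
pre = [52, 48, 61, 55, 47, 50, 58, 49, 53, 57]
post = [66, 59, 70, 64, 55, 63, 71, 60, 62, 68]

gain = pre_post_gain(pre, post)  # average point gain per trainee
```

The same function works for any two waves of the same measure, which is the essence of a pre/post design.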

  11. Our Approach: Utilization-Focused Evaluation • Focuses on intended uses and intended users • Is inherently participatory and collaborative by actively involving primary intended users in all aspects of the evaluation • Leads to ongoing, longer-term commitment to using evaluation logic and building a culture of learning in a program or organization

  12. Benefits of Evaluation • Program/organizational improvement • Accountability to funders and others • Planning • Program description for stakeholders • Public relations • Fund raising • Policy decision making Evaluation has lots of upside!

  13. Planning the Evaluation

  14. Elements of the Evaluation Plan
  • Who conducts the evaluation? Internal or external? Experienced or novice?
  • When do they do it? Along the way or after the fact?
  • How much do they do? The level of intensity must fit the project: too much diverts resources, too little leaves unanswered questions.
  • What exactly do they do? Six major steps.

  15. Evaluation Steps 1. Specify goals 2. Establish measures 3. Collect data 4. Analyze data 5. Prepare reports 6. Improve project

  16. Step 1: Specify Goals
  Thinking about goals: What are you trying to accomplish? What would success look like? What is the difference between the current state of affairs and what you are trying to create?
  Example of a goal statement: “Increase incomes of low-income families in the region through training for entry-level jobs that have career ladders leading to good jobs.”

  17. Step 2: Establish Measures
  Determine performance measures. Measures must be quantifiable, and the data must be available, reliable, and valid.
  Examples of measures:
  • Process: number of trainees
  • Outcome: skill and credential gains
  • Impact: wage increases and promotions
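The three example measure types can be sketched as simple counts over trainee records. The record layout and field names below are assumptions made for illustration, not taken from any DOL template.

```python
# A hypothetical sketch of computing process, outcome, and impact
# measures from trainee records. Field names and values are invented.

trainees = [
    {"completed": True,  "credential": True,  "wage_pre": 9.00, "wage_post": 11.50},
    {"completed": True,  "credential": False, "wage_pre": 8.50, "wage_post": 9.75},
    {"completed": False, "credential": False, "wage_pre": 9.25, "wage_post": 9.25},
]

# Process measure: number of trainees.
num_trainees = len(trainees)

# Outcome measure: credentials gained.
credentials_awarded = sum(1 for t in trainees if t["credential"])

# Impact measure: trainees whose wages increased.
wage_increases = sum(1 for t in trainees if t["wage_post"] > t["wage_pre"])
```

Each measure stays quantifiable because it is computed directly from fields that can be collected and verified.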

  18. Step 3: Collect Data
  • Identify data sources, such as: administrative records; surveys, interviews, and focus groups; observation
  • Gather data: design the instruments and procedures for collection; conduct data collection periodically
  • Record data: organize the data, create a database, and verify the data
  Remember the measures!

  19. Step 4: Analyze and Interpret Data
  • Sort and sift: organize data for interpretation (cross tabs, modeling)
  • Conduct data analysis to look for: changes over time; progress relative to goals or standards; differences between groups
  • Test preliminary interpretations
  This is the most creative step.
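A cross tab of the kind mentioned above can be built with nothing but the standard library. The groups and outcome categories below are invented examples of a "differences between groups" comparison.

```python
# A minimal cross-tabulation sketch using only the standard
# library; the categories and records below are invented.
from collections import Counter

# Each record pairs a participant group with an outcome status.
records = [
    ("completers", "employed"), ("completers", "employed"),
    ("completers", "not employed"), ("non-completers", "employed"),
    ("non-completers", "not employed"), ("non-completers", "not employed"),
]

# Count each (group, outcome) cell of the cross tab.
crosstab = Counter(records)
```

Reading the cells row by row gives the group-by-outcome table that supports a between-group comparison.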

  20. Step 5: Prepare Reports • Determine reporting schedule • Report preliminary findings to key stakeholders and other audiences • Gather reactions • Incorporate reactions • Finalize reporting products Different audiences need different types of reports.

  21. Step 6: Improve Project • Deliver reporting products internally • Facilitate strategic and operational planning • Improve processes and results A good evaluation will be more valuable to you than to DOL!

  22. Logic Modeling

  23. Does the project hang together? • Are the expected outcomes realistic? • Are there enough resources? • Do the customers like the product? • Does the organization have the right skills? Logic models help answer these questions.

  24. A Simple Logic Model
  Inputs: things needed to run the project (people, stuff, money, etc.)
  Activities: what you do (market, recruit, design, train, place, etc.)
  Outputs: direct results of activities (training completers, credentials awarded, etc.)
  Outcomes: changes caused by the project (jobs, wages, promotions, etc.)
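The four-column logic model can be captured as a simple data structure. This is a sketch only; the class and its example entries mirror the slide and are not part of any DOL template.

```python
# A sketch of the four-column logic model as a plain data structure;
# the example entries follow the slide and are illustrative only.
from dataclasses import dataclass

@dataclass
class LogicModel:
    inputs: list       # things needed to run the project
    activities: list   # what you do
    outputs: list      # direct results of activities
    outcomes: list     # changes caused by the project

model = LogicModel(
    inputs=["people", "stuff", "money"],
    activities=["market", "recruit", "design", "train", "place"],
    outputs=["training completers", "credentials awarded"],
    outcomes=["jobs", "wages", "promotions"],
)
```

Writing the model down this way makes the left-to-right chain explicit: every outcome should trace back through an output and an activity to an input.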

  25. Logic Models Focus on Outcomes
  Mission: concise statement of purpose
  Goal: broad statement of a desired outcome
  Objective: measurable statement of an expected outcome over a period of time
  Performance Measures: ongoing quantitative indicators of outcome achievement for each objective

  26. The Evaluation Process for Earmark Grants

  27. Use the DOL Tools • “The Essential Guide for Writing an Earmark Grant Proposal” • “Evaluation Template for Earmark Grantees” (to be provided later)

  28. Earmark Grant Evaluation: An Introduction and Overview. Presented by: Nancy Hewat, Senior Project Manager, Public Policy Associates, Inc., 119 Pere Marquette Drive, Lansing, Michigan 48912-1231, (517) 485-4477, www.publicpolicy.com. May 2005
