
Developing Earmark Grant Performance Measures: Grant Proposal Section 3


Presentation Transcript


  1. Developing Earmark Grant Performance Measures: Grant Proposal Section 3 • Deanna Khemani

  2. Session Objectives • Participants will be able to: • Understand the context within which your grant operates and is funded • Identify performance measures that relate to your grant’s purpose & discuss measurement parameters • Understand the importance of having a data collection & information system

  3. Session Overview • Setting the Context • Grant Proposal Requirements & Key Terminology • Identifying Performance Measures & Other Things to Consider

  4. The Context • Your grant is funded within an existing system, is affected by that system, and draws requirements from that system • Government Performance and Results Act (GPRA) • The Workforce Investment Act of 1998 (WIA) • The President’s Management Agenda • The Labor, Health and Human Services and Education Appropriations Act

  5. The Context • GPRA or “Results Act” • Federal agencies should stop preoccupation with measuring processes and activities and should focus on bottom-line results • ETA will use GPRA performance targets to evaluate the reasonableness of earmark grantees’ expected levels of performance • WIA • Seven key themes; one of the primary principles is the need for increased accountability at the federal, state, and local levels, including the grantee level

  6. The Context • President’s Management Agenda • Federal agencies need to evaluate their programs and grants, show achievement of desired results, and tie budget decisions to performance (consistent with GPRA) • “What matters in the end is completion. Performance. Results. Not just making promises, but making good on promises.” • Appropriations Act • Contains specific language requiring DOL to assess earmark grant performance

  7. Who Cares? • Congress and the American people require results and a return on the federal investment • PEC is required to develop a Notice of Grant Award statement that outlines the project’s intended outcomes/results • Earmark grantees must develop a performance management system that: • Contains performance measures that relate to the stated goals of the project • Collects, analyzes and reports information so that the project can continuously improve and show results

  8. Performance Management • Performance management is a systematic process that involves: • Monitoring activities and subcontracts for results; • Collecting, analyzing and reporting program and fiscal information; • Using data to influence program decision-making and resource allocation; and • Communicating results to advance organizational learning and inform stakeholders • Performance management involves the entire organization

  9. Grant Proposal Elements • Your earmark grant proposal guide references the following: • Goals • Performance Measures • Expected Levels of Performance • Actual Levels of Performance

  10. Grant Proposal Elements

  11. Grant Proposal Elements • Goal • A broad, comprehensive statement that clearly defines the purpose of the grant. • In the grant proposal, you need to provide a narrative statement identifying your project goals • Example 1: To provide high-tech manufacturing training to 50 individuals. • Example 2: To analyze whether current office equipment on the market is accessible to persons with disabilities and to disseminate research findings.

  12. Grant Proposal Elements • Performance Measure • The name given to what the grantee will measure. What is it that you are measuring? • Example 1: Training Completion Rate is the “measure” of individual achievement in the training curriculum. • Example 2: Developing a report outlining accessibility issues is an obvious performance measure, but Presentation Attendance could also be a “measure” of whether the message is reaching the “right” people

  13. Grant Proposal Elements • Performance measure specifics, also referred to as measurement parameters • How the performance measure “works” • Who is included in the measure? • Who is excluded from the measure? • What is the time period for measurement? • What data sources will be used in the measurement? This implies that you have a system in place to capture the information and report it. • What level of performance is sufficient to count in the measure?
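To make these parameters concrete, here is a minimal sketch in Python (the record fields, dates, and passing threshold are hypothetical illustrations, not requirements from the proposal guide) of how a Training Completion Rate measure could spell out who is included, the reporting period, and the level of performance that counts:

    from datetime import date

    # Hypothetical participant records; a real grant would pull these from its
    # case-management or training data system.
    participants = [
        {"id": 1, "enrolled": date(2024, 1, 15), "completed": True,  "final_score": 82},
        {"id": 2, "enrolled": date(2024, 2, 1),  "completed": True,  "final_score": 58},
        {"id": 3, "enrolled": date(2024, 6, 30), "completed": False, "final_score": None},
    ]

    # Measurement parameters (illustrative assumptions):
    PERIOD_START, PERIOD_END = date(2024, 1, 1), date(2024, 3, 31)  # time period
    PASSING_SCORE = 70                                              # level that counts

    # Who is included: participants enrolled during the reporting period;
    # everyone else is excluded from the denominator.
    included = [p for p in participants
                if PERIOD_START <= p["enrolled"] <= PERIOD_END]

    # Who counts as a completer: finished the curriculum with a passing score.
    completers = [p for p in included
                  if p["completed"] and (p["final_score"] or 0) >= PASSING_SCORE]

    completion_rate = len(completers) / len(included) if included else 0.0
    print(f"Training Completion Rate: {completion_rate:.0%}")  # -> 50%

Whatever tool you actually use, writing the parameters down this explicitly is what lets two people compute the same number from the same data.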

  14. Example Grant Proposal Table 1

  15. Example Grant Proposal Table 2

  16. Grant Proposal Elements • Expected Level of Performance • The numeric performance target that the grantee expects to achieve on each performance measure; sometimes referred to as a “performance standard.” • ETA will use the GPRA levels to assess the reasonableness of your performance targets

  17. Performance-Related Terminology • Actual Performance Level • The numeric performance level that the grantee achieved on each performance measure. • How you performed on the performance measure relative to the Expected Level of Performance.
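As a quick illustration of how these two terms relate (the figures below are hypothetical, not targets from any actual grant), comparing an actual level to an expected level is simple arithmetic:

    # Hypothetical target and result for a single performance measure.
    expected_level = 0.80   # Expected Level of Performance: 80% training completion
    actual_level   = 0.74   # Actual Performance Level achieved during the period

    variance = actual_level - expected_level        # negative means a shortfall
    met_target = actual_level >= expected_level
    print(f"Met target: {met_target}; variance: {variance:+.0%}")  # -> False; -6%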

  18. Example Grant Proposal Table 1

  19. Example Grant Proposal Table 2

  20. Grant Proposal Elements • Grantees are also required to provide information on who is responsible for submitting reports (if different from the grantee organization) • Grantees must describe how they will maintain, track and report program data: • What types of information will be collected? • How will the information be secured? • How often will reports be generated? • How will the information collected be used to manage the project?

  21. Developing Performance Measures • Select appropriate performance measures (see Handout I) • Many different kinds of measures (process, output, outcome) • Assess where information on the measure will originate (see Handout III) • What is evidence of success? Do you have access to the information? How will data be collected? • Track information • How will you know how you are doing (e.g., data collection system)? What information is needed? How will it be collected and tracked? • Develop reports • How will you process and aggregate the data?
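One way to picture the “track information” and “develop reports” steps is the small aggregation sketch below, which rolls hypothetical participant-level records up into the numbers a quarterly report would cite (the field names and quarters are assumptions for illustration, not a prescribed reporting format):

    from collections import defaultdict

    # Hypothetical participant-level records captured by the data collection system.
    records = [
        {"quarter": "2024-Q1", "completed": True,  "placed": True},
        {"quarter": "2024-Q1", "completed": True,  "placed": False},
        {"quarter": "2024-Q1", "completed": False, "placed": False},
        {"quarter": "2024-Q2", "completed": True,  "placed": True},
    ]

    # Aggregate individual records into per-quarter counts and rates.
    report = defaultdict(lambda: {"enrolled": 0, "completed": 0, "placed": 0})
    for r in records:
        row = report[r["quarter"]]
        row["enrolled"] += 1
        row["completed"] += int(r["completed"])
        row["placed"] += int(r["placed"])

    for quarter, counts in sorted(report.items()):
        rate = counts["completed"] / counts["enrolled"]
        print(quarter, counts, f"completion rate {rate:.0%}")

However the data are actually stored, the reporting system should be able to regenerate each reported figure from the underlying records.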

  22. Where Do I Start? • Before deciding on indicators and performance measures, always start with your grant. • What’s the purpose? • Why were you funded? • What are you expected to accomplish?

  23. Using a Logic Model • INPUT → PROCESS → OUTPUT → OUTCOME → IMPACT • The Federal Government focuses on the outcome end of this chain, but many things happen before “the outcome” that can be measured. Interim or intermediate measurements might even predict the ultimate outcome!

  24. Understanding Intermediate Measures • Input Measures: “What You Invest” • Measures related to outreach and recruitment • Enrollment rates • Measures related to percentage of accepted referrals from other partners • Process Measures: “What You Do” • Attendance rates (particularly for youth) • Extent of partnering/referrals or co-enrollments • Timeliness of reports

  25. Understanding Intermediate Measures • Output Measures: “What you accomplish as a result of the product or activity” • Completion rates • % successful completions • # of exits with positive outcomes by ‘x’ time period • Outcome Measures: “What difference does the project make” • % of Participants Placed in Employment, Customer Satisfaction, and Diploma Attainment Rate • Change in policymaking

  26. Understanding Intermediate Measures • Impact Measures:“What is the effect of the project” • Societal benefits • Reduced welfare dependency (i.e., self-sufficiency) • Fewer incarcerations • You need to have a mixture of process and output measures so that you can manage performance in order to achieve desired outcomes

  27. Appropriate Earmark Measures • Demonstration/Pilot Projects; Multiservice Projects; or Multistate Projects • Curriculum Development Rate (output measure) • Training Completion Rate (output measure) • Skill Attainment Rate • Placement in Employment Rate • Employer and/or Participant-Customer Satisfaction Rates • Earnings Change or Percent Increase in Wages • Retention Rates • Cost Per Participant
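Two of these measures reduce to simple formulas; a short sketch with made-up numbers (not benchmarks or actual grant figures) shows the arithmetic:

    # Hypothetical grant figures, for illustration only.
    grant_expenditures = 250_000.00   # total project spending to date
    participants_served = 50

    cost_per_participant = grant_expenditures / participants_served
    print(f"Cost Per Participant: ${cost_per_participant:,.2f}")  # -> $5,000.00

    # Earnings Change / Percent Increase in Wages for one participant.
    pre_wage, post_wage = 14.00, 17.50   # hourly wage before and after training
    earnings_change = post_wage - pre_wage
    pct_increase = earnings_change / pre_wage
    print(f"Earnings Change: ${earnings_change:.2f}/hr ({pct_increase:.0%} increase)")  # -> 25%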

  28. Appropriate Earmark Measures • Research Projects • Report completed • Community Awareness or Penetration Rate • Employer and/or Customer-Satisfaction Rates

  29. Conclusion • Just remember that your performance measures need to be tied directly to your project’s goals and that the measures should have quantified performance targets • Each performance measure should answer one of the following questions: • How well did you succeed in providing customer value? • How well did you deliver the services/activities that support the creation of customer value?

  30. Example 1 A project provides high-tech manufacturing training to 50 individuals in order to help them gain high-paying employment. • Based on this goal statement, what are possible performance measures? • How could the performance measure be defined? • What would be appropriate performance parameters? • What data sources could be used to assess performance?

  31. Example 2 This project will develop a distance learning program that utilizes computer-based training in order to reach students in rural areas. • Based on this goal statement, what are possible performance measures? • How could the performance measure be defined? • What would be appropriate performance parameters? • What data sources could be used to assess performance?
