
Performance Measurement
TJDR Peer to Peer Meeting
October 7 and 10, 2011
Washington, D.C.

Janet Chiancone
Associate Administrator for Budget and Planning

Kristen Kracke
Performance Measures Coordinator

Agenda
  • History on Federal Performance Measurement

  • Current Efforts: Where We Are Now

  • What Are Performance Measures?

  • OJJDP Core Measures and TYP Data Reports

  • Why It Matters


Federal History On Performance and Accountability

Government Performance and Results Act (GPRA)

Shift from accountability for process to accountability for results

Programs must show effectiveness to justify funding


Several State-level efforts also in place


Current Administration

Performance/Accountability a priority

Economic realities make this necessary

WH Position: “Chief Performance Officer”

White House websites:

“If we believe the government can make a difference in people’s lives, we have the obligation to prove that it works – by making government smarter, and leaner and more effective...”

President Barack Obama

April 13, 2011


Performance Management: President Obama’s Administration

Six identified strategies with the highest potential for achieving meaningful performance improvement within and across Federal agencies:

  • Driving Agency Top Priorities

  • Cutting Waste

  • Reforming Contracting

  • Closing the IT Gap

  • Promoting Accountability and Innovation through Open Government

  • Attracting and Motivating Top Talent


Funding and Information Flows

Programs need to show effectiveness to justify funding.

[Diagram: funding and information flows between Congress and OMB and grantees and programs.]


What Is Performance Measurement?

A system of tracking progress of the chosen activities in accomplishing specific goals, objectives, and outcomes.

Performance measurement:

  • Is directly related to program goals and objectives

  • Measures progress of the activities quantitatively

  • Is not exhaustive

  • Provides a “temperature” reading—it may not tell you everything you want to know but provides a quick and reliable gauge of selected results


Evaluation vs. Performance Measurement

            Performance Measurement      Evaluation
Question    How much?                    What does it mean?
Example     Game score                   Game analysis
Offers      A tally                      Causality
Timeframe   Continuous (Ongoing)         Interval (Discrete)
Cost        Less expensive               More expensive

Performance measurement is necessary, but not sufficient, for evaluation.


What are OJJDP’s Performance Measures?


Office of Juvenile Justice and Delinquency Prevention’s Charge

  • Authorizing legislation is the Juvenile Justice and Delinquency Prevention Act of 2002

  • Focus is on helping States and localities to respond to juvenile risk behavior and delinquency

  • Primary function of the agency is to provide program grant funding, and support research and technical assistance/training


Diversity of Programs

  • Formula and block grants for States

  • Tribal Youth Programs

  • Discretionary competitive programs

  • Enforcing Underage Drinking Laws (block and discretionary grants)

  • Victimization grants (Amber Alert, internet safety)

  • Congressional Earmark grants


OJJDP Generally Funds Four Types of Programs/Projects:

  • Direct Service Prevention

  • Direct Service Intervention

  • System Improvement

  • Research and Development


Operationalizing Core Measures for OJJDP Programs

  • A small number of measures that directly link to OJJDP’s core directives.

  • Comparability within and across programs

  • A focus on quality services and youth outcomes


OJJDP’s “Core” Measures

  • Percent of Program Youth who offend or reoffend

  • Percent of youth who are victimized


Percent of Program Youth who exhibit a desired change in the targeted behavior.

  • Substance use

  • School attendance

  • School achievement

  • Social competence

  • Parenting

  • Gang activity

  • Cultural skill building/cultural pride

    [Several options – select the most relevant behavior; a minimal calculation sketch follows below.]
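As a concrete illustration only (not an OJJDP-prescribed procedure), the sketch below shows how a grantee might compute two of these core measures from youth records tracked during a reporting period. The record fields, function names, and sample numbers are hypothetical assumptions, not part of the TYP reporting requirements.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class YouthRecord:
    """One program youth tracked during a reporting period (hypothetical fields)."""
    youth_id: str                      # ID number used instead of a name
    reoffended: Optional[bool]         # None = not tracked for this measure
    behavior_improved: Optional[bool]  # desired change in the targeted behavior

def percent(numerator: int, denominator: int) -> float:
    """Percentage with a guard against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

def core_measures(records: list[YouthRecord]) -> dict[str, float]:
    """Compute two core-style measures: percent reoffending and percent with behavior change."""
    offense_tracked = [r for r in records if r.reoffended is not None]
    behavior_tracked = [r for r in records if r.behavior_improved is not None]
    return {
        "percent_reoffended": percent(
            sum(r.reoffended for r in offense_tracked), len(offense_tracked)),
        "percent_behavior_change": percent(
            sum(r.behavior_improved for r in behavior_tracked), len(behavior_tracked)),
    }

# Example: 1 of 3 tracked youth reoffended (33.3%); 2 of 2 tracked youth improved (100%).
records = [
    YouthRecord("Y0001", reoffended=False, behavior_improved=True),
    YouthRecord("Y0002", reoffended=True,  behavior_improved=None),
    YouthRecord("Y0003", reoffended=False, behavior_improved=True),
]
print(core_measures(records))
```

The same pattern extends to any of the targeted behaviors listed above; only the field being tracked changes.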


Tribal Youth Program (TYP) Grant

Core Measures Data


TYP Core Measures Data: Evidence-Based Programs

  • Evidence based programs and practices have been defined as “programs and practices that have been shown, through rigorous evaluation and replication, to be effective at preventing or reducing juvenile delinquency or victimization, or related risk factors.”

  • Figure 1 presents the percentage of grantees implementing evidence-based programs and/or practices for the TYP grant, out of 115 TYP grantees reporting.

  • A significant number of TYP grantees are implementing evidence-based programs and/or practices, as shown by the increase in the percentage of evidence-based programs implemented across all reporting periods.

  • During the January–June 2011 reporting period, approximately 39% (n=44) of TYP grantees implemented evidence-based programs and practices, totaling $16,048,726.

Figure 1. Percentage of Grantees Implementing Evidence-Based Programs and/or Practices


TYP Core Measures Data: Youth Served

  • During the current reporting period (January to June 2011), 15,355 youth and/or families were served across the 115 TYP grants reporting; 83% of those served were youth (n=12,712).

  • Youth and families completed 68,841 service hours, with 92% completed by youth.

  • Regarding the rate of offending for program participants, we found that 2% of youth offended in the short term, and 17% of youth re-offended during the reporting period.

  • Reported victimization levels among youth served were also relatively low. Approximately 1% of youth tracked were victimized during the reporting period (short-term).

  • Similarly, reported re-victimization levels among youth served were also relatively low. Approximately 2% of youth tracked were re-victimized during the reporting period.

Figure 2. Number of Program Youth Served


TYP Core Measures Data: Behavioral Change

  • As shown in Table 1, TYP grantees were required to measure performance and track data for certain target behaviors in each program category. The table lists the short-term percentages for the specified target behaviors across all program categories.

  • During the January to June 2011 reporting period, 8,529 youth received services for noted targeted behaviors.

  • Eighty-seven percent of youth exhibited a change in behavior (see Table 1).

Table 1. Target Behaviors, January–June 2011


Performance Measurement

  • Accurate and timely reporting of performance measures data is an important element of project management

  • Performance measurement information is used to

    • Improve the operation of the program

    • Provide hard proof of how, when, and what your program is doing

      Note: OJJDP will begin performance measurement validation and verification this year (TYP in future).


Procedures for Maintaining Data

  • Maintain written documentation:

    • Electronic files (Include back-up procedures)

    • Hard copies (attendance sheets, for example)

    • Keep records as proof of reported data.

  • Decide who has access to the data (limited)

  • Develop security procedures for protecting data

    • Password-protected electronic files

    • Locked drawers for paper files

    • Use ID# instead of names (see the sketch following this list)
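As one illustration of the ID-instead-of-names practice above, the sketch below replaces participant names with stable ID numbers before attendance records are written to a file. The field names, file name, and ID format are hypothetical assumptions rather than an OJJDP or DCTAT requirement.

```python
import csv

def pseudonymize(rows: list[dict], id_map: dict[str, str]) -> list[dict]:
    """Replace participant names with ID numbers before records are stored or shared.

    id_map (name -> ID) is the only place names appear; keep it separately,
    under restricted, password-protected access.
    """
    out = []
    for row in rows:
        row = dict(row)  # do not mutate the caller's records
        name = row.pop("name")
        row["youth_id"] = id_map.setdefault(name, f"Y{len(id_map) + 1:04d}")
        out.append(row)
    return out

# Example: an attendance sheet keyed by ID number instead of name.
attendance = [
    {"name": "Jane Doe", "sessions_attended": 8},
    {"name": "John Roe", "sessions_attended": 5},
]
id_map: dict[str, str] = {}
with open("attendance_by_id.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["youth_id", "sessions_attended"])
    writer.writeheader()
    writer.writerows(pseudonymize(attendance, id_map))
```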

Resources on DCTAT

For DCTAT questions contact
[email protected]

Toll-free Technical Assistance Hotline Number: 1-866-487-0512

Listening and Feedback:

  • What performance measures make sense? Which ones don't?
  • What data is easiest for you to collect? What data is the hardest?
  • What's missing?
  • What questions do you have?
