Performance management for justice information sharing

Performance Management for Justice Information Sharing

David J. Roberts

Global Justice Consulting

Steve Prisoc

Chief Information Officer

New Mexico State Courts

Elizabeth Zwicker

Program Specialist

US Bureau of Justice Assistance

2006 BJA/SEARCH

Regional Information Sharing Conference

March 27, 2007 Minneapolis, Minnesota


“Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.”

—H. James Harrington


WHY evaluate performance?

  • Information is control

  • Provides feedback to improve program performance

  • Provides information for resource allocation

  • Enables effective planning

  • Tests generalizations based on experiences and assumptions

  • Markets and develops support among funding bodies, constituents, and staff


Landscape of Performance Management

  • Investment appraisal and benefits realization

    • What is the actual investment we’re making?

    • How are benefits going to be collected and tracked?

  • Solid program management and tracking

    • Is the project on track?

    • How do we ensure it remains on track?

  • Achievement of the strategic objectives

    • Fundamentally, what is it that we’re trying to achieve in our information sharing initiative?


Process vs. Impact Evaluations

  • Process evaluations focus on how the initiative was executed; the activities, efforts, and workflow associated with the response. Process evaluations ask whether the response occurred as planned, and whether all components worked as intended. Fundamentally, a process evaluation posits the question, “Are we doing the thing right?”

  • Impact evaluations focus on the outcome (the what) of the initiative; the output (products and services) and outcome (results, accomplishment, impact). Did the problem decline or cease? And if so, was the response the proximate cause of the decline? Fundamentally, the impact evaluation posits the question, “Are we doing the right thing(s)?”


Balanced Scorecard

Originally developed in business by Kaplan & Norton

  • Financial – How do we look to stakeholders?

  • Customer – How well do we satisfy our internal and external customers’ needs?

  • Internal Business Process – How well do we perform at key internal business processes?

  • Learning and Growth – Are we able to sustain innovation, change, and continuous improvement?


Balanced Scorecard for Law Enforcement (Mark Moore, et al.)

  • Reduce criminal victimization

  • Call offenders to account

  • Reduce fear and enhance personal security

  • Guarantee safety in public spaces

  • Use financial resources fairly, efficiently, and effectively

  • Use force and authority fairly, efficiently, and effectively

  • Satisfy customer demands/achieve legitimacy with those policed


Trial Court Performance Standards

  • Access to Justice

  • Expedition and Timeliness

  • Equality, Fairness and Integrity

  • Independence and Accountability

  • Public Trust and Confidence


Corrections Performance

  • Activity

    • Work & industry

    • Education & training

    • Religion…

  • Justice

    • Staff fairness

    • Use of force

    • Grievances (# & type)...

  • Conditions

    • Space

    • Pop density

    • Freedom of movement…

  • Management

    • Satisfaction

    • Stress & burnout

    • Turnover…

  • Security

    • Drug Use

    • Significant Incidents

    • Community Exposure…

  • Safety

    • …of inmates

    • …of staff

    • …of environment…

  • Order

    • Inmate misconduct

    • Use of force

    • Perceived control…

  • Care

    • Stress & illness

    • Health care

    • Dental care…


Universal IJIS Elements

  • Definition: The ability to access and share critical information at key decision points throughout the whole of the justice enterprise.

  • Scope: Recognition that the boundaries of the enterprise are increasingly elastic—engaging not only justice, but also emergency & disaster management, intelligence, homeland security, first responders, health & social services, private industry, the public, etc.

  • Goal: Get the right information, to the right people, all of the time—underscores the need for dynamic information exchange.


Information Sharing Objectives

  • What is the problem we’re addressing?

  • What information do we have regarding current levels of performance?

  • What is it that we’re trying to do?

  • 3 Universal Objectives:

    • Improve Public Safety and Homeland Security;

    • Enhance the Quality and Equality of Justice;

    • Gain Operational Efficiencies and Effectiveness, and Demonstrate Return on Investment (ROI).


Sample Public Safety Measures

  • Decrease the amount of time it takes to serve a warrant

  • Decrease the amount of time for law enforcement to have details on protection orders

  • Reduce the amount of time it takes users of the integrated justice system to respond to a request from the public

  • Reduce the time it takes to complete a criminal history background check

  • Reduce the number of agencies that can’t communicate with each other

  • Increase the percentage of court dispositions that can be matched to an arrest—this will improve the quality of the computerized criminal history records

  • Decrease the average response time to establish a positive identification following an arrest

  • Reduce the number of incidents of criminal records being associated with the wrong person

  • Reduce recidivism

  • Reduce the fear of crime in target neighborhoods


JNET: Improved Public Safety & Homeland Security

  • Notifications

    • Timely notification of critical events

      • Arrest, disposition, warrant, violation, death, etc

    • Offender accountability and increased public safety.

      • Confirmed Notifications

        • FY01/02         3,645

        • FY02/03         18,349

        • FY03/04         29,980

        • FY04/05         33,264

        • FY05/06         46,424

        • Total = 178,339 confirmed notifications


Sample Quality of Justice Measures

  • Reduce the average time a defendant is held while waiting for a bond decision

  • Reduce the time it takes for correctional facility intake

  • Reduce the number of days it takes to process cases from arrest to disposition

  • Reduce the number of false arrests.

  • Reduce the amount of missing data.

  • Reduce the number of civilian complaints against local law enforcement

  • Reduce the number of continuances per case that result from scheduling conflicts between the courts, law enforcement, and prosecution

  • Reduce the number of cases without a next scheduled event

  • Reduce the average number of days or hours from arrest to arraignment


JNET: Improvement in the Quality of Justice

  • Improved decision making

    • At key decision points, providing the required information in a timely, usable method

      • Traffic Stop

        • Who is this person? Positive identification (photo, SID, etc)

        • Is this person wanted? Outstanding warrants/wants.

        • Is this person a threat? Previous history of violent behavior, firearms, etc.

  • Enhanced Overall Data Quality

    • Reduction of errors

    • Accurate and timely statistical reporting

  • Improve Business Process

    • Minimize offender processing time

    • Reduction in “holding” time.


Sample Efficiency/Effectiveness Measures

  • Reduce the number of hours that staff spend entering data manually or electronically

  • Reduce the costs of copying documents for justice organizations

  • Reduce the number of hours spent filing documents manually

  • Reduce the number of hours spent searching other governmental databases

  • Increase the number of law enforcement personnel performing community policing tasks, instead of administrative tasks

  • Reduce the amount of missing information in criminal justice databases

  • Reduce the number of corrections needed in databases maintained by CJIS agencies

  • Decrease the number of warrants that never get entered into the state registry

  • Increase the number of query hits on each agency database

  • Reduce the number of hours it takes to enter a court disposition into the state criminal history repository


JNET: Efficient and Effective ROI

    • PennDOT (DMV) Certified Drivers History via JNET

      • In 2003, PennDOT processed 157,840 certified driving history requests for local police, district attorneys, and the Courts

      • One clear performance measure is highlighted by the dramatic reduction in processing costs for PennDOT. The personnel cost metric is based on the time required to process a paper copy of the driver history request, including the manual application of an embossed certification seal. PennDOT calculates their personnel cost at $1.50 per certified history processed, and when incorporating a combined printing and mailing cost of $.50 per copy, the total cost to manually generate a certified driver history equates to $2.00 per request.

      • During August 2006, the 56,126 certified driving history requests processed by JNET saved PennDOT $112,252 in monthly operating expenses. Only 4,767 were processed in the traditional fashion.

      • PennDOT has reallocated personnel to support and process other areas of business such as ‘paid’ requests from individual citizens and pre-employment screeners.
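The per-request arithmetic above can be sketched in a few lines. The dollar figures come from the slide itself; the function name and structure are purely illustrative and are not JNET's or PennDOT's actual code.

```python
# Cost-avoidance arithmetic from the PennDOT example above (assumed constants
# per the slide: $1.50 personnel + $0.50 printing/mailing per manual request).

PERSONNEL_COST = 1.50   # personnel cost to manually process one certified history
PRINT_MAIL_COST = 0.50  # combined printing and mailing cost per paper copy

def monthly_savings(electronic_requests: int) -> float:
    """Cost avoided when requests are served via JNET instead of manually."""
    cost_per_manual_request = PERSONNEL_COST + PRINT_MAIL_COST  # $2.00 total
    return electronic_requests * cost_per_manual_request

# August 2006: 56,126 requests handled electronically
print(monthly_savings(56_126))  # 112252.0, matching the $112,252 cited
```

This kind of explicit unit-cost model is what makes the ROI measure auditable: anyone can recompute the savings from the request counts.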


Critical Assumptions

    • Baseline data exist regarding current or historical performance of the system

    • Access, ability and willingness to capture data regarding on-going performance

    • Timely, accurate and complete data collection

    • Appropriate and sufficiently detailed analysis techniques

    • Staff to conduct the analysis and reports

    • Effective communication mechanisms to:

      • Monitor on-going baseline performance

      • Constantly assess the impact and operations

    • Political will and operational capacity to do something as a result of what the measures show!


Performance Dashboards

What we’re NOT talking about:

  The threat level in the airline sector is HIGH or Orange (3/1/07)


What we ARE talking about…


Sample Performance Dashboard

Draft dashboard assessing performance on a series of dimensions that have been agreed upon by key decision makers.

This requires effective data collection and routine reporting from operational systems in place throughout the County, and agreement that we’re going to act on the data in order to respond to critical performance elements.
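The status logic behind such a dashboard can be sketched simply: each agreed measure gets thresholds that map its current value to a green/yellow/red state. The measure names and thresholds below are hypothetical; a real dashboard would pull values from the County's operational systems.

```python
# Minimal sketch of dashboard status logic (lower values = better here).
# Measures and thresholds are illustrative assumptions, not agreed standards.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    value: float
    green_max: float   # at or below this -> GREEN
    yellow_max: float  # at or below this -> YELLOW; above -> RED

    def status(self) -> str:
        if self.value <= self.green_max:
            return "GREEN"
        if self.value <= self.yellow_max:
            return "YELLOW"
        return "RED"

measures = [
    Measure("Days from arrest to arraignment", 1.2, green_max=2.0, yellow_max=4.0),
    Measure("Warrants not entered into registry (%)", 6.5, green_max=2.0, yellow_max=5.0),
]
for m in measures:
    print(f"{m.name}: {m.status()}")
```

The point of the sketch is that the thresholds, like the measures themselves, must be negotiated with stakeholders in advance; the code only reports what was already agreed.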


Establishing a Performance Management Program

    The Six Steps to Establishing a Performance-Based Management Program

    Source: Will Artley, DJ Ellison and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001)


Outcomes and Performance Measures

    • Outcomes are the benefits or results gained by reaching goals, achieving objectives and resolving strategic issues

    • Performance Measures are specific, measurable, time-bound expressions of future accomplishment that relate to goals, objectives and strategic initiatives

    • Goals, objectives and strategic initiatives should ideally lead to outcomes

    • Pragmatic performance measurement planners recognize that not all things that need to be measured can always be empirically linked to outcomes.


Not All Outcomes Easily Lend Themselves to Measurement

    Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.

    —Albert Einstein

    • It is important that performance measures be based on criteria that correspond to desired outcomes; however, it is often difficult or even impossible to obtain objective measures of certain key outcomes.


Populated Logic Model

Program Logic Model and Chain of Events

  • Program Feature and Activity: Rap sheet information of appropriate scope, timeliness, accuracy, and ease of use available to the magistrate judge at first court appearance/bond hearing

  • Initial Outcomes: Greater use of rap sheet information when setting bail/bond and conditions of release

  • Intermediate Outcomes: More appropriate conditions of release and establishment of bail/bond appropriate to both the arrest charges and the criminal history and past warrant information

  • Intermediate Outcomes II: 1. Fewer crimes committed by those awaiting trial; 2. Fewer failures to appear; 3. More timely disposition of criminal cases

  • Final Outcomes Reached: 1. Enhanced justice process; 2. Positive influence on lessening the total number of crimes committed


Use Scenario Approach to Reach Agreement and Define Performance

    • Bring stakeholders together to reach consensus on the desired state of integration

    • Define the current state of integration (baseline)

    • Quantify gap between current state and desired state

    • Define desired outcomes

    • Develop objectives and performance measures that can be linked to desired outcomes


Stakeholders Must Agree on Performance Measures in Advance

    • Perceived performance is an agreed-upon construct

    • Criteria for defining performance should be negotiated by stakeholders (and governing body) prior to developing measures

    • Stakeholders will value outcomes differently depending on their role within (or relative to) the justice enterprise


Characteristics of Good Measures

    • Measures link back to goals, objectives and mission statements

    • Measures drive the right behavior from employees, partners and consultants

    • Collecting data on measures is feasible and cost effective

    • Measures are understood by most employees

    • Measures lend themselves to being tracked on an ongoing basis so that drops in performance can be detected when there is time to do something about it.

    • Measures represent aspects of performance that we can actually change
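The "tracked on an ongoing basis so that drops can be detected" point can be made concrete with a small sketch: compare the newest value against a trailing baseline and flag a meaningful drop. The window size, tolerance, and sample data below are illustrative assumptions, not anything prescribed by the presentation.

```python
# Hedged sketch: flag a performance drop when the latest value falls more than
# `tolerance` below the average of the preceding `window` values (higher = better).

def performance_drop(history, window=4, tolerance=0.10):
    """True if the newest value is more than `tolerance` below the
    trailing average of the previous `window` values."""
    if len(history) < window + 1:
        return False  # not enough data to judge
    baseline = sum(history[-(window + 1):-1]) / window
    return history[-1] < baseline * (1 - tolerance)

# Monthly disposition-matching rates (percent) -- made-up illustrative data
rates = [82, 84, 83, 85, 74]
print(performance_drop(rates))  # True: 74 is well below the ~83.5 baseline
```

A measure that can be fed through a check like this, month after month, gives managers time to react while a decline is still correctable.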


Performance Measurement Caveats

    • Most people (including your employees and consultants) can learn to make measures come out the way they think you want them to, without actually improving a process

    • Always question the measures you’ve defined, keeping in mind that the people applying them could find ways of boosting the measures without really improving anything

    • Test each measure to determine if it operates as expected. Does it always go one way when things get better and the other when things get worse?


The Russian Nail

    • Manipulating a single metric allowed Soviet managers to appear successful even though their efforts did not always lead to expected outcomes.

    • Success was typically measured by singular metrics of gross output, such as weight, quantity, square feet, or surface area. Gross output indicators played havoc with assortments, sizes, quality, etc., and frequently resulted in products like Khrushchev’s chandeliers – so heavy “that they pull the ceilings down on our heads.”

    • A famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage. Two giant cranes were pictured holding up one giant nail.

Source: Paul Craig Roberts, “My Time with Soviet Economics,” The Independent Review, vol. VII, no. 2 (Fall 2002), pp. 259–264.


Behavior Driven the Wrong Way

    • The Soviet Union wasted billions searching for oil because it rewarded drilling crews on the basis of the number of feet drilled. Because it is easier to drill many shallow wells than a few deep wells, drillers drilled lots of shallow wells, regardless of what was advisable geologically.

    • A 1983 Chicago Sun-Times article reported that a Soviet hospital had turned away a seriously ill patient because "they were nearing their yearly quota for patient deaths—and would be criticized by authorities if they exceeded it."


Family of Related Measures

    • Produce x widgets per hour

    • Produce x widgets per hour without exceeding y dollars

    • Produce x widgets per hour without exceeding y dollars with only one full-time employee

    • Produce x widgets per hour without exceeding y dollars and with only one full-time employee and generating z units of waste

    • Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate

    • Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate and without contributing to global warming


The “Widget” Family

    • Number produced within specified time period

    • Cost of producing widgets

    • People required

    • Waste generated

    • Defect rate

    • CO2 produced


Specific Justice Example


Legislatively Imposed Measure

    Number of calls from users requesting assistance (lower number indicates superior performance)


Replacement Multi-Dimensional Measure

Measure: Length of time to resolve a call for service and the quality of service call resolution, as measured by the following two dimensions:

  1. Average time from the opening of a service ticket to the closing of a service ticket. JID will also report the median and standard deviation with the average.

  2. The quality of service as measured by regular user surveys designed to measure the quality of the service provided to the caller. Survey respondents are selected randomly.
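Computing the timeliness dimension is straightforward with standard statistics. The sketch below shows the mean, median, and standard deviation the measure calls for; the ticket durations and survey scores are made-up illustrative data, and the variable names are mine, not JID's.

```python
# Sketch of the two-part replacement measure: ticket timeliness statistics
# plus a survey-based quality score. All data below is illustrative.
import statistics

ticket_hours = [4.0, 6.5, 3.0, 12.0, 5.5, 7.0]  # open-to-close time per ticket
survey_scores = [5, 4, 4, 3, 5]                  # 1-5 user satisfaction ratings

resolution = {
    "mean": statistics.mean(ticket_hours),
    "median": statistics.median(ticket_hours),
    "stdev": statistics.stdev(ticket_hours),     # sample standard deviation
}
quality = statistics.mean(survey_scores)

print(resolution["median"], quality)  # 6.0 4.2
```

Reporting the median and standard deviation alongside the average, as the measure requires, guards against a few extreme tickets masking (or exaggerating) typical performance.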


Strategic Goal 3: Identify and Recommend Cost-Effective Biometric Identification Applications

    Objective 3.1:

    By September 2004, research, identify, and recommend technological applications that support biometrics for rapid identification.

    Objective 3.2:

    By September 2004, research, identify, and evaluate the costs and benefits of biometric identification applications.

    Outcomes:

    • Increased knowledge of biometric technologies

    • Improved cost-effective biometric identification solutions

    Performance Measures:

    • Number of research projects on biometric technological solutions completed by September 2004

    • Number of research projects on costs and benefits of biometrics completed by September 2004

    • Number of research reports presented to the Governing Body


Justice Performance Measures

    • Average law enforcement response time to calls for service for incidents involving a threat to citizen safety

    • Percent of arrest records in state repository with final dispositions

    • Number of automated information exchanges within and between criminal justice agencies

    • Number of crimes cleared using AFIS system(s)

    • Number of arrests made of wanted individuals resulting from the use of electronically available warrant and detainer information

    • Number of electronic checks of justice databases performed to identify high risk individuals

    • Average time from arrest to final case disposition for felony arrests


What Makes a Performance Measure Effective?

    • First and foremost, to be effective a measure must be an incentive to a person or group of persons to change behavior in such a way that things really improve.

    • A performance measure should provide feedback to a person or group of persons. Without feedback no information is available on whether the target implied by the measure is being met.

    • A performance measure (or family of measures) should be precise and comprehensive so as to prevent the possibility of the measure being met without actually leading to expected outcomes.


Three-Legged Stool

    Strategic Planning

    Performance Management

    Project Management


The Role of Project Plans

    • Project plans can augment a performance plan by ensuring that outputs are completed on time and on budget

    • Rigorous project management can ensure that tasks are actually performed before they are measured.

    • Project planning, along with strategic planning, is an essential adjunct to any performance management program.


There’s More to Management Than Measurement

    If you can’t measure it, you can’t manage it.

    —Peter Drucker

    Drucker’s saying has convinced some managers that measurement is management, which is a bit of an overstatement; however, measurement is one of the most powerful tools in the management toolbox.


Final Points

    • If you don’t monitor your performance, it will probably get worse.

    • You can’t devise performance measures in a vacuum; you must involve stakeholders and measure what’s valued.

    • Don’t devise measures for which you lack data.

    • Performance measurement can be expensive and time consuming, so why bother unless you intend to use the results to provide ongoing process feedback?

    • Errors in devising measures will lead to unexpected consequences.


    So inscrutable is the arrangement of causes and consequences in this world that a two-penny duty on tea, unjustly imposed in a sequestered part of it, changes the condition of its inhabitants.

    —Thomas Jefferson


Performance Measurement: BJA’s Perspective

    Elizabeth Zwicker


Purposes: Performance Measures

    • Linking people and dollars to performance

    • Linking programs and resources to results

    • Justification of continued funding

    • Learning and management tools for us, for you


What Does BJA Do With the Data?

    • GPRA: Government Performance and Results Act

    • PART: Program Assessment Rating Tool (www.expectmore.gov)

    • Budget formulation

    • MD&A: Management Discussion and Analysis


How You’ll Report Performance Measures

    • Via the semi-annual progress report submitted electronically via GMS (Grant Management System), due January 30 and July 30

    • Report only on grant-funded activities during the specified reporting period

    • Progress reports will not be accepted without complete data


Resources

    • Will Artley, DJ Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001), at http://www.orau.gov/pbm/pbmhandbook/pbmhandbook.html

    • John E. Eck, Assessing Responses to Problems: An Introductory Guide for Police Problem-Solvers (Washington, DC: Center for Problem-Oriented Policing, no date), at http://www.popcenter.org/Tools/tool-assessing.htm

    • Michael Geerken, The Art of Performance Measurement for Criminal Justice Information System Projects (Washington, DC: U.S. Department of Justice, Bureau of Justice Assistance, 2006 [forthcoming])

    • Robert H. Langworthy (ed.), Measuring What Matters: Proceedings from the Policing Research Institute Meetings (Washington, DC: NIJ/COPS, July 1999, NCJ 170610), pp. 37–53

    • David J. Roberts, Law Enforcement Tech Guide: Creating Performance Measures that Work! A Guide for Law Enforcement Executives and Managers (Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services, 2006), at http://www.cops.usdoj.gov/mime/open.pdf?Item=1968

