
Performance Management for Justice Information Sharing


Presentation Transcript


  1. Performance Management for Justice Information Sharing David J. Roberts Global Justice Consulting Steve Prisoc Chief Information Officer New Mexico State Courts Elizabeth Zwicker Program Specialist US Bureau of Justice Assistance 2006 BJA/SEARCH Regional Information Sharing Conference March 27, 2007 Minneapolis, Minnesota

  2. “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” —H. James Harrington

  3. WHY evaluate performance? • Information is control • Provides feedback to improve program performance • Provides information for resource allocation • Enables effective planning • Tests generalizations based on experiences and assumptions • Markets the program and builds support among funding bodies, constituents, and staff

  4. Landscape of Performance Management • Investment appraisal and benefits realization • What is the actual investment we’re making? • How are benefits going to be collected and tracked? • Solid program management and tracking • Is the project on track? • How do we ensure it remains on track? • Achievement of the strategic objectives • Fundamentally, what is it that we’re trying to achieve in our information sharing initiative?

  5. Process vs. Impact Evaluations • Process evaluations focus on how the initiative was executed: the activities, efforts, and workflow associated with the response. They ask whether the response occurred as planned and whether all components worked as intended. Fundamentally, a process evaluation poses the question, “Are we doing the thing right?” • Impact evaluations focus on the what of the initiative: the outputs (products and services) and outcomes (results, accomplishments, impact). Did the problem decline or cease? And if so, was the response the proximate cause of the decline? Fundamentally, an impact evaluation poses the question, “Are we doing the right thing(s)?”

  6. Balanced Scorecard (originally developed in business by Kaplan & Norton) • Financial – How do we look to stakeholders? • Customer – How well do we satisfy our internal and external customers’ needs? • Internal Business Process – How well do we perform at key internal business processes? • Learning and Growth – Are we able to sustain innovation, change, and continuous improvement?

  7. Balanced Scorecard for Law Enforcement (Mark Moore et al.) • Reduce criminal victimization • Call offenders to account • Reduce fear and enhance personal security • Guarantee safety in public spaces • Use financial resources fairly, efficiently, and effectively • Use force and authority fairly, efficiently, and effectively • Satisfy customer demands/achieve legitimacy with those policed

  8. Trial Court Performance Standards • Access to Justice • Expedition and Timeliness • Equality, Fairness and Integrity • Independence and Accountability • Public Trust and Confidence.

  9. Corrections Performance • Activity: work & industry; education & training; religion… • Justice: staff fairness; use of force; grievances (number & type)… • Conditions: space; population density; freedom of movement… • Management: satisfaction; stress & burnout; turnover… • Security: drug use; significant incidents; community exposure… • Safety: …of inmates; …of staff; …of environment… • Order: inmate misconduct; use of force; perceived control… • Care: stress & illness; health care; dental care…

  10. Universal IJIS Elements • Definition: The ability to access and share critical information at key decision points throughout the whole of the justice enterprise. • Scope: Recognition that the boundaries of the enterprise are increasingly elastic—engaging not only justice, but also emergency & disaster management, intelligence, homeland security, first responders, health & social services, private industry, the public, etc. • Goal: Get the right information, to the right people, all of the time—underscores the need for dynamic information exchange.

  11. Information Sharing Objectives • What is the problem we’re addressing? • What information do we have regarding current levels of performance? • What is it that we’re trying to do? • 3 Universal Objectives: • Improve Public Safety and Homeland Security; • Enhance the Quality and Equality of Justice; • Gain Operational Efficiencies, Effectiveness, and demonstrate Return on Investment (ROI).

  12. Sample Public Safety Measures • Decrease the amount of time it takes to serve a warrant • Decrease the amount of time for law enforcement to have details on protection orders • Reduce the amount of time it takes users of the integrated justice system to respond to a request from the public • Reduce the time it takes to complete a criminal history background check • Reduce the number of agencies that can’t communicate with each other • Increase the percentage of court dispositions that can be matched to an arrest—this will improve the quality of the computerized criminal history records • Decrease the average response time to establish a positive identification following an arrest • Reduce the number of incidents of criminal records being associated with the wrong person • Reduce recidivism • Reduce the fear of crime in target neighborhoods
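As one possible illustration of the disposition-to-arrest match measure above, the sketch below computes a match rate by joining two record sets. It is a minimal sketch in Python; the field names (person_id, tracking_no) and the records are hypothetical, not drawn from any actual repository schema.

```python
# Minimal sketch of the disposition-to-arrest match-rate measure.
# Field names and records are hypothetical.

def disposition_match_rate(dispositions, arrests):
    """Percent of court dispositions matched to an arrest record."""
    arrest_keys = {(a["person_id"], a["tracking_no"]) for a in arrests}
    if not dispositions:
        return 0.0
    matched = sum(1 for d in dispositions
                  if (d["person_id"], d["tracking_no"]) in arrest_keys)
    return 100.0 * matched / len(dispositions)

arrests = [{"person_id": 1, "tracking_no": "A-100"},
           {"person_id": 2, "tracking_no": "A-101"}]
dispositions = [{"person_id": 1, "tracking_no": "A-100"},
                {"person_id": 3, "tracking_no": "A-102"}]
print(disposition_match_rate(dispositions, arrests))  # 50.0
```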

  13. JNET: Improved Public Safety & Homeland Security • Notifications • Timely notification of critical events • Arrest, disposition, warrant, violation, death, etc. • Offender accountability and increased public safety • Confirmed Notifications: FY01/02: 3,645 • FY02/03: 18,349 • FY03/04: 29,980 • FY04/05: 33,264 • FY05/06: 46,424 • Total = 178,339 confirmed notifications

  14. Sample Quality of Justice Measures • Reduce the average time a defendant is held while waiting for a bond decision • Reduce the time it takes for correctional facility intake • Reduce the number of days it takes to process cases from arrest to disposition • Reduce the number of false arrests • Reduce the amount of missing data • Reduce the number of civilian complaints against local law enforcement • Reduce the number of continuances per case that result from scheduling conflicts between the courts, law enforcement, and prosecution • Reduce the number of cases without a next scheduled event • Reduce the average number of days or hours from arrest to arraignment
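Most of these measures are elapsed-time intervals between two recorded events, so they reduce to simple timestamp arithmetic. A minimal sketch, assuming hypothetical per-case arrest and arraignment timestamps:

```python
# Sketch: average days from arrest to arraignment. Timestamps are invented.
from datetime import datetime
from statistics import mean

cases = [
    {"arrest": datetime(2006, 1, 3), "arraignment": datetime(2006, 1, 5)},
    {"arrest": datetime(2006, 1, 4), "arraignment": datetime(2006, 1, 10)},
]
days = [(c["arraignment"] - c["arrest"]).days for c in cases]
print(f"average days, arrest to arraignment: {mean(days):.1f}")  # 2 and 6 -> 4.0
```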

  15. JNET: Improvement in the Quality of Justice • Improved decision making • At key decision points, providing the required information in a timely, usable form • Traffic Stop • Who is this person? Positive identification (photo, SID, etc.) • Is this person wanted? Outstanding warrants/wants • Is this person a threat? Previous history of violent behavior, firearms, etc. • Enhanced Overall Data Quality • Reduction of errors • Accurate and timely statistical reporting • Improved Business Processes • Minimized offender processing time • Reduction in “holding” time

  16. Sample Efficiency/Effectiveness Measures • Reduce the number of hours that staff spend entering data manually or electronically • Reduce the costs of copying documents for justice organizations • Reduce the number of hours spent filing documents manually • Reduce the number of hours spent searching other governmental databases • Increase the number of law enforcement personnel performing community policing tasks, instead of administrative tasks • Reduce the amount of missing information in criminal justice databases • Reduce the number of corrections needed in databases maintained by CJIS agencies • Decrease the number of warrants that never get entered into the state registry • Increase the number of query hits on each agency database • Reduce the number of hours it takes to enter a court disposition into the state criminal history repository

  17. JNET: Efficient and Effective ROI • PennDOT (DMV) Certified Drivers History via JNET • In 2003, PennDOT processed 157,840 certified driving history requests for local police, district attorneys, and the courts • One clear performance measure is the dramatic reduction in processing costs for PennDOT. The personnel cost metric is based on the time required to process a paper copy of the driver history request, including the manual application of an embossed certification seal. PennDOT calculates its personnel cost at $1.50 per certified history processed; adding a combined printing and mailing cost of $0.50 per copy, the total cost to manually generate a certified driver history is $2.00 per request. • During August 2006, the 56,126 certified driving history requests processed by JNET saved PennDOT $112,252 in monthly operating expenses. Only 4,767 were processed in the traditional fashion. • PennDOT has reallocated personnel to support and process other areas of business, such as “paid” requests from individual citizens and pre-employment screeners.
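The monthly savings figure follows directly from the unit costs on the slide. A quick check of the arithmetic:

```python
# The PennDOT savings arithmetic from the slide, verified step by step.
personnel_cost = 1.50      # $ per certified history processed manually
print_and_mail = 0.50      # $ per paper copy
cost_per_manual = personnel_cost + print_and_mail   # $2.00 per request

jnet_requests = 56_126     # August 2006 requests handled via JNET
savings = jnet_requests * cost_per_manual
print(f"${savings:,.2f}")  # $112,252.00
```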

  18. Critical Assumptions • Baseline data exist regarding current or historical performance of the system • Access, ability, and willingness to capture data regarding ongoing performance • Timely, accurate, and complete data collection • Appropriate and sufficiently detailed analysis techniques • Staff to conduct the analysis and produce reports • Effective communication mechanisms to: • Monitor ongoing baseline performance • Constantly assess impact and operations • Political will and operational capacity to do something as a result of what the measures show!

  19. Performance Dashboards. What we’re NOT talking about: [screenshot of a DHS threat advisory reading “The threat level in the airline sector is HIGH or Orange,” dated 3/1/07]

  20. What we ARE talking about…

  21. Sample Performance Dashboard Draft dashboard assessing performance on a series of dimensions agreed upon by key decision makers. This requires effective data collection and routine reporting from the operational systems in place throughout the County, plus agreement that we will actually act on the data in response to critical performance elements.
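One possible shape for such a dashboard roll-up is sketched below: each agreed-upon dimension carries a current value and a target and is reduced to a red/yellow/green status. The dimensions, targets, and tolerance are invented for illustration, and the sketch assumes lower values are better.

```python
# Sketch: roll each dashboard dimension up to a red/yellow/green status.
# Dimensions, targets, and tolerance are invented for illustration.

def status(value, target, tolerance=0.10):
    """Green at or better than target, yellow within tolerance, else red.
    Assumes lower values are better (e.g., elapsed days)."""
    if value <= target:
        return "green"
    if value <= target * (1 + tolerance):
        return "yellow"
    return "red"

dashboard = {                         # dimension: (current value, target)
    "days_arrest_to_arraignment": (2.4, 2.0),
    "warrant_entry_backlog_days": (1.8, 2.0),
}
for dimension, (value, target) in dashboard.items():
    print(f"{dimension}: {status(value, target)}")
```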

  22. Establishing a Performance Management Program The Six Steps to Establishing a Performance-Based Management Program Source: Will Artley, DJ Ellison and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001)

  23. Outcomes and performance measures • Outcomes are the benefits or results gained by reaching goals, achieving objectives and resolving strategic issues • Performance Measures are specific, measurable, time-bound expressions of future accomplishment that relate to goals, objectives and strategic initiatives • Goals, objectives and strategic initiatives should ideally lead to outcomes • Pragmatic performance measurement planners recognize that not all things that need to be measured can always be empirically linked to outcomes.
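One way to honor that linkage in practice is to store it explicitly, so every measure traces back to an objective, a goal, and a desired outcome. A minimal sketch with illustrative names and values:

```python
# Sketch: goals, objectives, and measures as linked records, so every
# measure traces back to the goal and desired outcome it serves.
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    unit: str      # e.g., "days", "percent"
    target: float
    due: str       # time-bound expression of future accomplishment

@dataclass
class Objective:
    statement: str
    measures: list = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    desired_outcome: str
    objectives: list = field(default_factory=list)

goal = Goal("Speed case processing", "More timely disposition of cases",
            [Objective("Shorten arrest-to-arraignment interval",
                       [Measure("days arrest to arraignment", "days", 2.0,
                                "by end of FY07")])])
```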

  24. Not all outcomes easily lend themselves to measurement Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted. —Albert Einstein • It is important that performance measures be based on criteria that correspond to desired outcomes; however, it is often difficult or even impossible to obtain objective measures of certain key outcomes.

  25. Program Logic Model and Chain of Events (populated logic model) • Program Feature and Activity: Rap sheet information of appropriate scope, timeliness, accuracy, and ease of use available to the magistrate judge at first court appearance/bond hearing • Initial Outcomes: Greater use of rap sheet information when setting bail/bond and conditions of release • Intermediate Outcomes: More appropriate conditions of release and establishment of bail/bond appropriate to both the arrest charges and the criminal history and past warrant information • Intermediate Outcomes II: (1) Fewer crimes committed by those awaiting trial; (2) Fewer failures to appear; (3) More timely disposition of criminal cases • Final Outcomes Reached: (1) Enhanced justice process; (2) Positive influence on lessening the total number of crimes committed

  26. Use scenario approach to reach agreement and define performance • Bring stakeholders together to reach consensus on the desired state of integration • Define the current state of integration (baseline) • Quantify gap between current state and desired state • Define desired outcomes • Develop objectives and performance measures that can be linked to desired outcomes

  27. Stakeholders must agree on performance measures in advance • Perceived performance is an agreed-upon construct • Criteria for defining performance should be negotiated by stakeholders (and governing body) prior to developing measures • Stakeholders will value outcomes differently depending on their role within (or relative to) the justice enterprise

  28. Characteristics of good measures • Measures link back to goals, objectives and mission statements • Measures drive the right behavior from employees, partners and consultants • Collecting data on measures is feasible and cost effective • Measures are understood by most employees • Measures lend themselves to being tracked on an ongoing basis so that drops in performance can be detected when there is time to do something about it. • Measures represent aspects of performance that we can actually change
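The ongoing-tracking characteristic implies some simple alarm logic behind the scenes. The sketch below compares each new observation against a rolling baseline and flags a drop while there is still time to react; the window size and threshold are arbitrary illustrative choices, and higher values are assumed to be worse (e.g., elapsed days).

```python
# Sketch: flag a drop in performance against a rolling baseline.
# Window size and threshold are arbitrary illustrative choices.
from collections import deque
from statistics import mean

def drop_detector(window=6, threshold=0.25):
    history = deque(maxlen=window)
    def observe(value):
        """Return True if `value` is worse than the rolling baseline
        by more than `threshold` (higher = worse, e.g., elapsed days)."""
        alarm = bool(history) and value > mean(history) * (1 + threshold)
        history.append(value)
        return alarm
    return observe

check = drop_detector()
for v in [10, 11, 10, 12, 10, 15]:
    print(v, check(v))   # only 15 is flagged against the ~10.6 baseline
```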

  29. Performance measurement caveats • Most people (including your employees and consultants) can learn to make measures come out the way they think you want them to, without actually improving a process • Always question the measures you’ve defined, keeping in mind that the people applying them could find ways of boosting the measures without really improving anything • Test each measure to determine if it operates as expected. Does it always go one way when things get better and the other when things get worse?

  30. The Russian Nail • Manipulating a single metric allowed Soviet managers to appear successful even though their efforts did not always lead to expected outcomes. • Success was typically measured by singular metrics of gross output, such as weight, quantity, square feet, or surface area. Gross output indicators played havoc with assortments, sizes, quality, etc., and frequently resulted in products like Khrushchev’s chandeliers, so heavy “that they pull the ceilings down on our heads.” • A famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage. Two giant cranes were pictured holding up one giant nail. Source: Paul Craig Roberts, “My Time with Soviet Economics,” The Independent Review, v. VII, n. 2, Fall 2002, pp. 259–264.

  31. Behavior driven the wrong way • The Soviet Union wasted billions searching for oil because it rewarded drilling crews on the basis of the number of feet drilled. Because it is easier to drill many shallow wells than a few deep wells, drillers drilled lots of shallow wells, regardless of what was advisable geologically. • A 1983 Chicago Sun-Times article reported that a Soviet hospital had turned away a seriously ill patient because “they were nearing their yearly quota for patient deaths—and would be criticized by authorities if they exceeded it.”

  32. Family of related measures • Produce x widgets per hour • Produce x widgets per hour without exceeding y dollars • Produce x widgets per hour without exceeding y dollars with only one full-time employee • Produce x widgets per hour without exceeding y dollars and with only one full-time employee and generating z units of waste • Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate • Produce x widgets per hour without exceeding y dollars with only one full-time employee and generating z units of waste and at a zero defect rate and without contributing to global warming

  33. The “widget” family • Number produced within specified time period • Cost of producing widgets • People required • Waste generated • Defect rate • CO2 produced
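For a family like this to bite, a production run should count as a success only when every member measure is satisfied at once, which is precisely what defeats the single-metric gaming in the Soviet examples above. A minimal sketch with invented thresholds:

```python
# Sketch: a run succeeds only if every measure in the family is met,
# so boosting one metric at the expense of another is caught.
# All thresholds are invented for illustration.

FAMILY = {
    "widgets_per_hour": lambda v: v >= 100,
    "cost_dollars":     lambda v: v <= 500,
    "full_time_staff":  lambda v: v <= 1,
    "waste_units":      lambda v: v <= 3,
    "defect_rate":      lambda v: v == 0,
    "co2_kg":           lambda v: v <= 50,
}

def meets_family(result):
    """Return (passed, list of failed measures) for one production run."""
    failures = [name for name, ok in FAMILY.items() if not ok(result[name])]
    return (not failures, failures)

run = {"widgets_per_hour": 120, "cost_dollars": 480, "full_time_staff": 1,
       "waste_units": 2, "defect_rate": 0, "co2_kg": 60}
print(meets_family(run))   # (False, ['co2_kg']) -- fast and cheap, but dirty
```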

  34. Specific Justice Example

  35. Legislatively imposed measure: Number of calls from users requesting assistance (a lower number indicates superior performance)

  36. Replacement multi-dimensional measure Measure: Length of time to resolve a call for service and the quality of service call resolution, as measured by the following two dimensions: 1. Average time from the opening of a service ticket to the closing of a service ticket. JID will also report the median and standard deviation along with the average. 2. The quality of service as measured by regular user surveys designed to gauge the quality of the service provided to the caller. Survey respondents are selected randomly.
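Both dimensions of the replacement measure can be computed with nothing more than standard-library statistics and random sampling. A minimal sketch; the ticket durations, caller list, and sample size are invented:

```python
# Sketch of the two-dimensional measure: resolution-time statistics
# plus a randomly drawn survey sample. All data are invented.
import random
from statistics import mean, median, stdev

resolution_hours = [4, 6, 5, 30, 5, 7, 6, 5]   # hours per closed ticket

print(f"avg {mean(resolution_hours):.1f} h, "
      f"median {median(resolution_hours):.1f} h, "
      f"std dev {stdev(resolution_hours):.1f} h")

# Dimension 2: random selection of survey respondents from recent callers.
callers = [f"caller-{i}" for i in range(200)]
survey_sample = random.sample(callers, k=20)
```

The 30-hour outlier illustrates why the measure asks for the median and standard deviation alongside the average: a single long-running ticket inflates the mean while the median stays put.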

  37. Strategic Goal 3: Identify and recommend cost-effective biometric identification applications Objective 3.1: By September 2004, research, identify, and recommend technological applications that support biometrics for rapid identification. Objective 3.2: By September 2004, research, identify, and evaluate the costs and benefits of biometric identification applications. Outcomes: • Increased knowledge of biometric technologies • Improved cost-effective biometric identification solutions Performance Measures: • Number of research projects on biometric technological solutions completed by September 2004 • Number of research projects on costs and benefits of biometrics completed by September 2004 • Number of research reports presented to the Governing Body

  38. Justice performance measures • Average law enforcement response time to calls for service for incidents involving a threat to citizen safety • Percent of arrest records in state repository with final dispositions • Number of automated information exchanges within and between criminal justice agencies • Number of crimes cleared using AFIS system(s) • Number of arrests made of wanted individuals resulting from the use of electronically available warrant and detainer information • Number of electronic checks of justice databases performed to identify high risk individuals • Average time from arrest to final case disposition for felony arrests

  39. What makes a performance measure effective? • First and foremost, to be effective, a measure must be an incentive for a person or group of persons to change behavior in such a way that things really improve. • A performance measure should provide feedback to a person or group of persons. Without feedback, no information is available on whether the target implied by the measure is being met. • A performance measure (or family of measures) should be precise and comprehensive, so as to prevent the measure being met without actually leading to the expected outcomes.

  40. Three-legged Stool • Strategic Planning • Performance Management • Project Management

  41. The role of project plans • Project plans can augment a performance plan by ensuring that outputs are completed on time and on budget • Rigorous project management can ensure that tasks are actually performed before they are measured. • Project planning, along with strategic planning, is an essential adjunct to any performance management program.

  42. There’s more to management than measurement If you can’t measure it, you can’t manage it. —Peter Drucker Drucker’s saying has convinced some managers that measurement is management, which is a bit of an overstatement; measurement is, however, one of the most powerful tools in the management toolbox.

  43. Final points • If you don’t monitor your performance, it will probably get worse. • You can’t devise performance measures in a vacuum; you must involve stakeholders and measure what’s valued. • Don’t devise measures for which you lack data. • Performance measurement can be expensive and time consuming, so don’t bother unless you intend to use the results to provide ongoing process feedback. • Errors in devising measures will lead to unexpected consequences.

  44. “So inscrutable is the arrangement of causes and consequences in this world that a two-penny duty on tea, unjustly imposed in a sequestered part of it, changes the condition of its inhabitants.” —Thomas Jefferson

  45. Performance Measurement: BJA’s Perspective Elizabeth Zwicker

  46. Purposes: Performance Measures • Linking people and dollars to performance • Linking programs and resources to results • Justification of continued funding • Learning and management tools for us, for you

  47. What Does BJA Do With the Data? • GPRA: Government Performance and Results Act • PART: Program Assessment Rating Tool (www.expectmore.gov) • Budget formulation • MD&A: Management Discussion and Analysis

  48. How You’ll Report Performance Measures • Via the semi-annual progress report submitted electronically via GMS (Grants Management System), due January 30 and July 30 • Report only on grant-funded activities during the specified reporting period • Progress reports will not be accepted without complete data

  49. Resources • Will Artley, DJ Ellison, and Bill Kennedy, The Performance-Based Management Handbook, Volume 1: Establishing and Maintaining a Performance-Based Management Program (Washington, DC: U.S. Department of Energy, 2001), at http://www.orau.gov/pbm/pbmhandbook/pbmhandbook.html • John E. Eck, Assessing Responses to Problems: An Introductory Guide for Police Problem-Solvers (Washington, DC: Center for Problem-Oriented Policing, no date), at http://www.popcenter.org/Tools/tool-assessing.htm • Michael Geerken, The Art of Performance Measurement for Criminal Justice Information System Projects (Washington, DC: U.S. Department of Justice, Bureau of Justice Assistance, 2006 [forthcoming]) • Robert H. Langworthy (ed.), Measuring What Matters: Proceedings from the Policing Research Institute Meetings (Washington, DC: NIJ/COPS, July 1999, NCJ 170610), pp. 37–53 • David J. Roberts, Law Enforcement Tech Guide: Creating Performance Measures that Work! A Guide for Law Enforcement Executives and Managers (Washington, DC: U.S. Department of Justice, Office of Community Oriented Policing Services, 2006), at http://www.cops.usdoj.gov/mime/open.pdf?Item=1968
