Performance indicators: good, bad, and ugly

The report of the Royal Statistical Society working party on performance monitoring in the public services, chaired by Professor Sheila Bird


“Performance monitoring done well is broadly productive for those concerned. Done badly, it can be very costly and not merely ineffective but harmful and indeed destructive - of morale, reputations and the public services.”


Methodological rigour in selecting indicators

  • Sample surveys should be designed, conducted and analysed in accordance with statistical theory and best practice

  • Admin data should be fully auditable

  • Concepts, questions, etc should be comparable and harmonised where possible – conforming to national or international standards as appropriate

  • Indicators should be precise and accurate enough to show reliably when change has occurred (a sample-size sketch follows this list)
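
To make the last point concrete, here is a minimal sketch (in Python, with invented figures) of the standard normal-approximation calculation for how large a sample is needed to detect a change between two proportions reliably; the report itself does not prescribe this particular formula.

```python
# Sketch: sample size per period needed to detect a change between two
# proportions, using the standard normal-approximation formula.
# All figures below are invented for illustration.
from scipy.stats import norm

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.8):
    """Approximate n per group to detect a change from p1 to p2."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

# e.g. detecting an improvement from 70% to 75% in a survey-based indicator
print(round(sample_size_two_proportions(0.70, 0.75)))  # ~1248 respondents per period
```

If the survey can only afford a few hundred respondents per period, a five-point change cannot be detected reliably and the indicator fails this criterion.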


Definitions should be precise

  • Definitions of both indicators and targets should be

    • Precise but practicable

      • useful definitions should be given for all the key concepts in the indicator or target

    • Consistent over time

      • any changes to definitions or methods should be fully documented

    • Unambiguous

      • there should be no possibility of disagreement about whether an increase or a decrease in the indicator represents progress


Practitioners involved should have input

  • For targets to be ambitious but achievable, a good understanding is needed both of the practicalities of delivery on the ground and of the data

  • To understand the practicalities of delivery, practitioners should be consulted

  • Targets intended to motivate but set unrealistically may instead demoralise


Monitor for perverse outcomes

  • Badly thought-through targets can lead practitioners to play the system rather than genuinely improve performance

  • An example from the report:

    • An indicator for prisons is the number of “serious” assaults on prisoners

    • “Serious” = proven prisoner-on-prisoner assault

    • The indicator would improve if prisons reduced their investigations into assaults


Do not ignore uncertainty or variability

  • Insistence on single numbers as answers to complex questions is to be resisted

  • Natural variability, outliers, recording errors and statistical error (i.e. confidence intervals around sample estimates) all need to be considered (a confidence-interval sketch follows this list)

  • All need to be clearly presented
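
As a minimal sketch of the presentation point (in Python, with an invented survey result), a sample proportion can be reported with its confidence interval rather than as a bare number; the normal approximation used here is one common choice, not the report's prescription.

```python
# Sketch: 95% confidence interval around a sample proportion
# (normal approximation). The survey figures are invented.
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and 95% CI for a proportion."""
    p_hat = successes / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, p_hat - z * se, p_hat + z * se

# e.g. 412 of 600 survey respondents rate a service as "good"
p, lo, hi = proportion_ci(412, 600)
print(f"{p:.1%} (95% CI: {lo:.1%} to {hi:.1%})")  # 68.7% (95% CI: 65.0% to 72.4%)
```

Reporting the interval alongside the point estimate makes clear whether an apparent year-on-year change is larger than sampling noise.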


Do not set 100% targets

  • 100% targets can lead to perverse outcomes, demoralise when failure inevitably occurs, and consume disproportionate resources (a short calculation follows this list)

  • An example from the report:

    • “No patient shall wait in A&E for more than 4 hours”

    • This becomes irrelevant as soon as one patient does wait more than 4 hours

    • A&E staff may have very sound reasons for making a small number of people wait longer
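
A back-of-envelope sketch (in Python, with invented figures) of why failure against a 100% target is near-inevitable at scale: even a tiny per-patient breach probability makes at least one breach over a year almost certain.

```python
# Sketch: probability of at least one breach of a 100% target.
# Per-patient breach probability and annual volume are invented.
def prob_at_least_one_breach(p_breach, n_patients):
    return 1 - (1 - p_breach) ** n_patients

# e.g. a 0.1% per-patient breach chance across 50,000 A&E attendances a year
print(prob_at_least_one_breach(0.001, 50_000))  # ≈ 1.0: failure is near-certain
```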


Do not ignore the distribution

  • Performance indicators are single numbers

  • Single number summaries of data can be misleading

  • An example from the report:

    • “Number of patients waiting more than 4 hours”

    • The whole distribution needs to be examined to understand the indicator, e.g. has progress been achieved by getting most people seen in 3 hours 59 minutes while some wait 10 hours? (See the sketch after this list.)
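
Here is a minimal sketch (in Python, with invented waiting times) of two hospitals that score identically on the single-number indicator while their waiting-time distributions differ in exactly the way described above.

```python
# Sketch: identical "over 4 hours" counts, very different distributions.
# Waiting times (in hours) are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
# Hospital A: most patients seen quickly; 10 mild breaches
a = np.concatenate([rng.uniform(0.5, 3.5, 990), rng.uniform(4.0, 4.5, 10)])
# Hospital B: most patients squeezed in just under 4 hours; 10 long breaches
b = np.concatenate([rng.uniform(3.5, 3.99, 990), rng.uniform(4.0, 10.0, 10)])

for name, waits in [("A", a), ("B", b)]:
    over = int((waits > 4).sum())
    p50, p90, p99 = np.percentile(waits, [50, 90, 99])
    print(f"{name}: over 4h = {over}, median = {p50:.1f}h, "
          f"90th = {p90:.1f}h, 99th = {p99:.1f}h")
```

Both hospitals report the same indicator value, but only the full distribution shows that one delivers genuinely short waits while the other does not.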


Do not mistake statistical significance for practical importance

  • A difference cannot be claimed to be of practical importance if it is not statistically significant (the difference might not be genuine; it could just be chance)

    BUT

  • A difference could be statistically significant but not practically important (statistical significance can be achieved with a large enough sample size), as the sketch below illustrates
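
As a minimal sketch of the second point (in Python, with invented counts): with two million cases per group, a difference of one tenth of a percentage point comes out as statistically significant.

```python
# Sketch: a trivially small difference becomes "significant" at huge n.
# Counts are invented; this is a standard two-proportion z-test.
import math
from scipy.stats import norm

def two_proportion_z(p1, n1, p2, n2):
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)       # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))                   # two-sided p-value

z, p = two_proportion_z(0.501, 2_000_000, 0.500, 2_000_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # z ≈ 2.00, p ≈ 0.046: significant, yet trivial
```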


Consider not setting a target until data are well understood

  • The statistical properties of an indicator will be much better understood after one or two rounds of analysis

  • It may therefore be sensible to wait before setting a target


Document everything: others should be able to replicate procedures

  • All assumptions and methods should be fully documented so that others can fully understand and replicate results

  • A ‘PM Protocol’ (performance monitoring protocol) should include the following (a structured sketch follows this list):

    • Objectives

    • Definitions

    • Survey methods / information about data

    • Information about context

    • Risks of perverse outcomes

    • How the data will be analysed

    • Components of variation

    • Ethical, legal and confidentiality issues

    • How, when and where data will be published
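
As a structured sketch only (the report prescribes the content of a PM Protocol, not any particular format), the checklist above could be captured as a hypothetical Python dataclass:

```python
# Hypothetical sketch: the slide's PM Protocol checklist as a data structure.
# Field names are this sketch's invention, not the report's terminology.
from dataclasses import dataclass, field

@dataclass
class PMProtocol:
    objectives: str
    definitions: str                        # precise indicator/target definitions
    data_sources: str                       # survey methods / information about data
    context: str                            # information about context
    perverse_outcome_risks: list[str] = field(default_factory=list)
    analysis_plan: str = ""                 # how the data will be analysed
    variation_components: str = ""          # components of variation
    ethics_and_confidentiality: str = ""    # ethical, legal, confidentiality issues
    publication_plan: str = ""              # how, when and where data will be published
```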


The report is available on the RSS website here:

http://www.rss.org.uk/PDF/Performance%20monitoring%20231003.pdf

