
Lesson 05 Performance Measurement



  1. Lesson 05 Performance Measurement

  2. So, what’s a good measure? “When an airline became concerned that a crucial aspect of customer satisfaction concerned how quickly the passengers retrieved their luggage after landing, it set an objective to improve baggage delivery and introduced a new performance measure. The day after this was introduced, senior managers watched in astonishment as one of the teams of baggage handlers unloaded the cases from an incoming flight. Initially, the team members stood chatting together as they waited for the tractor to bring the baggage trucks from the aircraft. When it arrived, the team leader grabbed one small bag and threw it to the youngest member of the team. The youth caught the bag and sprinted across the tarmac with it. He reached the conveyor belt, threw on the bag, hit the start button and then sauntered back to the group. None of the others had moved. They continued chatting for several more minutes before eventually starting to unload the rest of the bags.” (Bourne, M. and Neely, A., 2002)

  Question: How had the measure been operationally defined?
  Answer: The time taken for the first bag to hit the conveyor after the aircraft had landed.
  Outcome: Is that what the customer wanted?

  3. How about this? In 2007, Dell senior leadership wanted to improve its call centers. The company had an employee performance measure called handle time per call, defined as the “length of time a single employee stayed on the phone with a customer”. This bad measure motivated call-center employees to simply transfer callers, getting rid of complaining customers by making them someone else’s problem. Fact: 7,000 customers were likely to be transferred seven times or more on a single call. Dell senior leadership changed the measure to minutes per resolution of a problem, with a goal (target) of “resolution in one call”. (Jarvis, 2009, What Would Google Do?)

  4. How about you? Any bad-measure stories?

  5. What about leading & lagging?

  Leading Indicator: an indirect predictor of achievement (should have at least one per objective, maybe more).
  Lagging Measure: a direct, comprehensive assessment of objective achievement (must have one per objective).

  Example. Objective: Satisfy Customer Needs
  • Leading indicator, New Customers: number of new customers added.
  • Lagging measure, Customer Satisfaction: percentage of customers having a score of at least 4.0 on the annual survey (scale of 1.0 – 5.0).

  6. Measures vs. Indicators. Objective: Satisfy Customer Needs

  Indicators (lead: indirect predictors of performance): number of customer complaints; gain in market share; number of new customers added.
  Measure (lag: direct assessment of actual performance): customer satisfaction. (The indicator “number of new customers added” is not used as a lag measure.)

  7. So, is this a good lag indicator?

  8. Measurement Strategy: Quantitative versus Qualitative Measurement

  Quantitative measurement: counting; timing; tests; questionnaires; checklists; rating forms.
  Qualitative measurement: observation; surveys; open-ended interviews; focus groups; case study evaluation.

  9. Assignment of Values

  Primary data:
  1. Record keeping: recording, counting, and calculating factual data (size, amount, demographics).
  2. Instrumentation: observing/testing attributes or performance, and surveying beliefs or attitudes, to capture abstract data (effectiveness, efficiency, quality).

  Secondary data:
  3. Records capture: obtaining data from databases derived from previous data collection activities.

  10. Assignment of Values (cont.): Description of Quantitative Performance Measures

  THE ASSIGNED VALUE (total number, average number, cost or dollars, rate, duration, type) of THE UNIT BEING MEASURED (services, products, process, people, events, materials), based on A CERTAIN CHARACTERISTIC (physical, quality, timeliness, effectiveness, innovation, conditions).

  11. Assignment of Values (cont.): Description of Quantitative Performance Measures, as a RATIO of ONE MEASURE to ANOTHER

  Productivity or Efficiency = total number of products produced by an employee ÷ total labor hours or resources consumed by that employee
  Percentage = total number of units having a particular characteristic ÷ total number of units independent of the characteristic
  Cost-benefit = costs avoided or dollars gained by a program ÷ dollars spent or invested in that program
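The three ratio measures above can be sketched as simple Python functions; the input numbers below are illustrative, not from the slides.

```python
def productivity(units_produced, labor_hours):
    """Productivity/efficiency: output per unit of resource consumed."""
    return units_produced / labor_hours

def percentage(units_with_characteristic, total_units):
    """Share (in percent) of units having a particular characteristic."""
    return 100.0 * units_with_characteristic / total_units

def cost_benefit(dollars_gained, dollars_spent):
    """Cost-benefit: dollars gained or avoided per dollar invested."""
    return dollars_gained / dollars_spent

# Illustrative values only:
print(productivity(120, 40))           # 3.0 units per labor hour
print(percentage(45, 60))              # 75.0 percent
print(cost_benefit(250_000, 100_000))  # 2.5 dollars returned per dollar spent
```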

  12. Assignment of Values (cont.): Description of Quantitative Performance Measures, as a WEIGHTED SCORE (WEIGHT × MEASURE)

  20 × No. fatalities due to on-job injury
  10 × No. terminations due to on-job injury
  2 × No. lost work days due to on-job injury
  1 × No. restricted work days due to on-job injury
  Weighted score = the sum of these products.
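A minimal sketch of the weighted safety score: each injury count is multiplied by its weight and the products are summed. The weights are the slide's; the counts are made up for illustration.

```python
# Weights from the slide: severity multipliers for on-job injury events.
WEIGHTS = {
    "fatalities": 20,
    "terminations": 10,
    "lost_work_days": 2,
    "restricted_work_days": 1,
}

def weighted_score(counts):
    """Sum of weight * count over all injury categories."""
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)

# Illustrative counts, not slide data:
counts = {"fatalities": 0, "terminations": 1,
          "lost_work_days": 12, "restricted_work_days": 30}
print(weighted_score(counts))  # 10*1 + 2*12 + 1*30 = 64
```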

  13. INS Contractor Performance Index

  Performance Measures     Weight   Score   Weighted Score
  Customer Satisfaction    30%      75      22.5
  Within Budget            20%      70      14.0
  On Time                  20%      89      17.8
  Technical Performance    30%      85      25.5
  Contractor Index Score = 79.8

  Interpretation: >80 Good–Excellent; 70–80 Fair–Standard; <70 Marginal–Unsatisfactory.
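The contractor index is a weighted sum of measure scores with weights totaling 100%. A sketch using the slide's weights and scores (the pairing of measures to rows follows the slide's reading order):

```python
# (measure name, weight, score) triples taken from the slide.
measures = [
    ("Customer Satisfaction", 0.30, 75),
    ("Within Budget",         0.20, 70),
    ("On Time",               0.20, 89),
    ("Technical Performance", 0.30, 85),
]

# Contractor Index Score = sum of weight * score over all measures.
index = sum(weight * score for _, weight, score in measures)
print(round(index, 1))  # 79.8, i.e. Fair-Standard (70-80) on the slide's scale
```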

  14. Operational Definitions

  At the very least: name & intended meaning; operational means for collecting data; rules/procedures for assigning values; owner / POC; frequency of collection; (name of instrument).
  To understand even more: data collectors; sampling technique; verification technique; goals or targets; data analysis guidelines; use of information.
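One way to make the "at the very least" fields concrete is a small record type. This is an illustrative sketch, not a prescribed schema; the example values are loosely based on the Dell measure from slide 3.

```python
from dataclasses import dataclass

@dataclass
class OperationalDefinition:
    """Minimal operational definition of a performance measure."""
    name: str               # name & intended meaning
    collection_method: str  # operational means for collecting data
    valuation_rules: str    # rules/procedures for assigning values
    owner: str              # owner / point of contact
    frequency: str          # frequency of collection
    instrument: str = ""    # name of instrument, if any

# Hypothetical example, loosely based on the Dell call-center measure:
first_call_resolution = OperationalDefinition(
    name="Minutes per resolution of a problem",
    collection_method="Call-center phone logs",
    valuation_rules="Sum call minutes across all calls until the issue is closed",
    owner="Call Center Operations",
    frequency="Weekly",
    instrument="CRM ticketing system",
)
print(first_call_resolution.name)
```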

  15. Op Def Template

  16. Examples of Performance Measures: Outputs or Outcomes?
  • Department of Education Troops-to-Teachers Program: Percentage of Troops-to-Teachers who remain in teaching for 3 or more years after placement in a teaching position in a high-need school. (Answer: Outcome)
  • Department of Health and Human Services Office of Child Support Enforcement: Total dollars collected per $1 of expenditures. (Answer: Output)
  • Patent and Trademark Office: Average patent pendency [the estimated time in months for a complete review of a patent application, from the filing date to issue or abandonment of the application]. (Answer: Output)
  • Food and Drug Administration: Increase consumer understanding of diet-disease relationships, and in particular, the relationships between dietary fats and the risk of coronary heart disease, the leading cause of death in the US. (Answer: Outcome)

  17. Examples of Performance Measures: Outputs or Outcomes? (cont.)
  • Department of Justice Bureau of Alcohol, Tobacco and Firearms: Percentage of high-crime cities nationwide with a reduction in violent firearms crime. (Answer: Outcome)
  • Department of State US Humanitarian Demining Program: Square meters of land cleared and restored to productive use in sponsored programs in countries receiving US assistance. (Answer: Output)
  • Department of Transportation: Fatalities per 100 million vehicle-miles of travel (VMT). (Answer: Outcome)
  • National Aeronautics and Space Administration: Progress in characterizing the present climate of Mars and determining how it has evolved over time. (Answer: Outcome)

  18. Criteria for Effective Measures

  Accurate: valid; reliable.
  Relevant: multi-dimensional; credible; important.
  Practical: tamper-proof; economic; timely; simple.

  19. Which Measure Should You Use? Accurate? Relevant? Practical?

  Agency Mission: Provide educationally disadvantaged adults with basic literacy, arithmetical, and other life skills training in order to help them gain access to vocational skills training and job opportunities.

  1. No. community colleges receiving grants. (Agency records)
  2. Average time to complete review of grant application. (Agency records)
  3. No. visits to community colleges to discuss and review programs. (Employee trip/expense reports)
  4. No. colleges visited at least 3 times per year. (Employee trip/expense reports)
  5. No. spaces offered to students. (Community college reports)
  6. Cost per space made available. (#5 divided by grant amount)
  7. No. students completing training. (Community college reports)
  8. Cost per student completing program. (#7 divided by grant amount)
  9. Average student-to-teacher ratio. (Community college reports)
  10. No. students moving up one grade level. (Test scores)
  11. % of students who meet individual training plan goals. (Employee reports)
  12. % of students who reported that they received benefit from the program. (Existing client survey)
  13. % of students who take vocational training classes within 4 months of program completion. (New client survey)
  14. % of students who successfully complete vocational skills training within 1 year of program completion. (New client survey)
  15. % of students who obtain employment within 1 year of program completion. (New client survey)
  16. % of students who retain employment within 2 years of program completion. (New client survey)
  17. % of students who leave social assistance programs within 1 year of program completion. (Agency records)
