
Desktop Metrics: What Should We Measure?





Presentation Transcript


  1. Desktop Metrics: What Should We Measure? John McDermon, Group Leader, DCS-2 Departmental Computing Services Division. LA-UR-09-03076

  2. Three questions
  • What behavior do you want to promote?
  • What story do you want to tell?
  • Who do you want to tell this story to?

  3. Behavior
  • You don’t get what you want, you get what you measure [1]
  • Beware of unintended consequences
  • 3 traditional vectors:
    • Better: Quality, Accuracy, Efficacy (defects, repeats)
    • Faster: Timeliness, Efficiency (response time, resolution time)
    • Cheaper: Total cost, Cost per unit (define costs, define units)
  • Where does quantity fall?
  1. <http://www.whitehouse.gov/omb/expectmore/Council_for_Excellence_in_Government_ExpectMore_Booklet.pdf>

  4. Story
  • Purpose: Inform, Explain, Persuade
  • Timeframe:
    • Past: “Once upon a time…”
    • Present: real time (dashboard, HUD)
    • Future: “Gaze into our crystal ball…”

  5. Audience
  • Customers
  • Upper management
  • Internal management
  • Competitors
  • Peers
  • Critics
  • Others?

  6. Other things to consider
  • Leading or lagging indicators
  • Trends or spot values
  • Cost and difficulty of acquiring data
    • What data are you collecting now?
    • What are your current systems capable of?
  • Frequency

  7. Discussion
  • Example 1: Faster; explain about the future; audience: customers
  • Example 2: Cheaper; persuade about the past; audience: upper management
  • Example 3: Better; inform about the present; audience: peers

  8. LANL Environment
  • ≈ 12,000 employees (all types, including students)
  • 40+ square miles
  • ≈ 30,000 network devices
    • ≈ 15,000 fingerprint as WIN
    • ≈ 6,000 other operating systems
    • plus printers, switches, etc.
  • ≈ 3,000 standalone systems

  9. Departmental Computing Services
  • 5 groups, ≈ 280 staff
    • 1 central services group, ≈ 30 staff
    • 4 field groups, ≈ 250 staff
  • Support over 80% of LANL departmental computing assets
  • Includes: standards, tools & services; call center; Electronic Software Distribution (ESD); departmental servers and services
  • Does not include: network or phones; enterprise application development or operations

  10. Volume of work
  FY04: 69,230
  FY05: 78,483
  FY06: 112,060
  FY07: 116,805
  FY08: 120,132
  FY09: 68,042

  11. Service Level Agreement
  • High: 80% resolved in 4 work hours
  • Medium: 80% resolved in 3 work days (24 work hours)
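The SLA targets above reduce to a simple compliance calculation: for each priority, the percentage of tickets resolved within the target. The sketch below uses invented ticket data (not LANL's actual records or schema) just to show the arithmetic.

```python
# Hypothetical ticket records: (priority, work hours to resolve).
# All values are illustrative, not real support data.
tickets = [
    ("High", 2.5), ("High", 5.0), ("High", 3.0),
    ("Medium", 20.0), ("Medium", 30.0), ("Medium", 8.0),
]

# SLA targets in work hours (3 work days = 24 work hours).
SLA = {"High": 4.0, "Medium": 24.0}

def sla_compliance(tickets, priority):
    """Percent of tickets at `priority` resolved within the SLA target."""
    hours = [h for p, h in tickets if p == priority]
    met = sum(1 for h in hours if h <= SLA[priority])
    return 100.0 * met / len(hours)

print(f"High:   {sla_compliance(tickets, 'High'):.1f}% within SLA")
print(f"Medium: {sla_compliance(tickets, 'Medium'):.1f}% within SLA")
```

A result at or above 80% on each line would meet the stated agreement; the FY08/FY09 charts on the following slides presumably plot exactly this number over time.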

  12. FY08 High (4 hrs)

  13. FY08 Medium (3 days)

  14. FY09 High (4 hrs)

  15. FY09 Medium (3 days)

  16. Workload Tracking
  • Tickets Created
  • Tickets Closed
  • Tickets in Queue

  17. FY07 Ticket Queue

  18. FY08 Ticket Queue

  19. FY09 Ticket Queue

  20. FY07 – FY09 Ticket Queue

  21. Attempt to show age of queue

  22. Customer Satisfaction
  • Every ticket closed generates a survey email
  • 3 vectors:
    • Promptness: “How satisfied were you with the promptness and efficiency of our desktop support service?”
    • Accuracy: “How satisfied are you that your request was completed accurately without creating other problems?”
    • Professionalism: “How satisfied are you with the courtesy, knowledge and experience of our staff?”
  • 5-point scale (−2 to +2): Very Dissatisfied = −2, Neutral = 0, Very Satisfied = +2
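The scoring described above, mapping each 5-point response onto −2…+2 and averaging per vector, can be sketched as follows. The response labels between the named endpoints, and all of the data, are assumptions for illustration.

```python
# Map survey labels to scores on the -2..+2 scale from the slide.
# The intermediate labels ("Dissatisfied", "Satisfied") are assumed.
SCALE = {
    "Very Dissatisfied": -2, "Dissatisfied": -1, "Neutral": 0,
    "Satisfied": 1, "Very Satisfied": 2,
}

# Invented responses, grouped by the three survey vectors.
responses = {
    "Promptness":      ["Very Satisfied", "Satisfied", "Neutral"],
    "Accuracy":        ["Satisfied", "Satisfied", "Very Satisfied"],
    "Professionalism": ["Very Satisfied", "Very Satisfied", "Satisfied"],
}

def mean_score(labels):
    """Average score for one vector, on the -2..+2 scale."""
    return sum(SCALE[label] for label in labels) / len(labels)

for vector, labels in responses.items():
    print(f"{vector}: {mean_score(labels):+.2f}")
```

Averaging on a signed scale keeps "Neutral" at zero, so a positive mean directly reads as net satisfaction, which is convenient for the trend charts on the next slides.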

  23. Survey Response Rate

  24. Survey Scores

  25. Research: quantifying an environment
  • Assertion: complexity increases the cost of support
  • Examples of complexity:
    • Number of systems
    • Number of operating systems and versions
    • Number of applications and versions
    • Classified (complexity of security plan)
    • Servers, dual boot, etc.
    • Method of access (network vs. standalone)
  • How to quantify complexity?
  • Could these be variables in a linear equation?
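The "variables in a linear equation" idea can be sketched as an ordinary least-squares regression of support cost on complexity factors. Every factor value, cost figure, and the no-intercept form below is invented for illustration; only the factor names come from the slide.

```python
import numpy as np

# Rows are hypothetical environments; columns are complexity factors
# from the slide: systems, OS versions, application versions,
# classified systems. All numbers are made up.
X = np.array([
    [100, 2,  40,  0],
    [250, 3,  90,  5],
    [400, 5, 150, 20],
    [600, 6, 200, 30],
], dtype=float)
cost = np.array([50_000, 130_000, 230_000, 340_000], dtype=float)

# Least-squares fit: cost ~= X @ beta (no intercept, for simplicity).
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

# Each coefficient estimates the marginal cost of one unit of that
# complexity factor; predict the cost of a new environment.
new_env = np.array([300, 4, 120, 10], dtype=float)
print(new_env @ beta)
```

Whether support cost is actually linear in these factors is exactly the open question the slide poses; a fit like this would at least show which factors carry the most weight.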

  26. Harvey Mudd Clinic: A Multi-Criteria Optimization Model for Trade-offs Between Services and Costs in Computer Support Services. This project shall deliver a decision-analysis model that relates the cost of support to the types of services available. It will include a multi-criteria optimization model for making trade-offs between the level of service provided and the cost of that service. The cost of support will be modeled as a function of actual costs (e.g., salaries, cost of systems, cost of equipment, software, etc.) and time costs (actual costs in the form of FTEs required to support a given environment, but also including training and professional-development costs). The model will permit comparisons among different environments so that choices between technologies and levels of support can be made by the users.

  27. Harvey Mudd Clinic (cont.) To estimate the number of technicians required at each skill level, we anticipate developing a multi-class queueing model in which each "server" (technician) is capable of serving different classes of customers based on their skill set. Statistical analysis will be required to estimate the distribution of the time between service requests of each type (i.e., the arrival rate and inter-arrival-time distribution) and the distribution of service times; these are the inputs to the queueing model. Queueing theory as well as computer simulation will be used to estimate the correct number of technicians at each skill level. This queueing model in turn will provide a simple "rule-of-thumb" metric relating environment "complexity" to the number of technicians. This metric will be obtained by fitting a statistical model to the results of the queueing simulation to infer which factors have the greatest impact on the number of FTEs required.
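A classical single-class baseline for the staffing question the clinic describes is the Erlang C formula for an M/M/c queue: given an arrival rate and a per-technician service rate, find the smallest technician count so that a target fraction of requests starts service within a deadline. This is a much simpler model than the multi-class one proposed above, and the rates below are invented, not measured.

```python
from math import exp, factorial

def erlang_c(c, a):
    """Erlang C: probability an arriving request must wait, with c
    servers and offered load a = arrival_rate / service_rate (a < c)."""
    rho = a / c
    num = a**c / (factorial(c) * (1 - rho))
    den = sum(a**k / factorial(k) for k in range(c)) + num
    return num / den

def techs_needed(lam, mu, t, sla=0.80):
    """Smallest technician count so at least `sla` of requests start
    service within t hours, under M/M/c (Poisson arrivals at rate lam,
    exponential service at rate mu per technician, FIFO)."""
    c = int(lam / mu) + 1  # minimum server count for a stable queue
    # P(wait > t) = ErlangC * exp(-(c*mu - lam) * t) for M/M/c FIFO.
    while erlang_c(c, lam / mu) * exp(-(c * mu - lam) * t) > 1 - sla:
        c += 1
    return c

# Illustrative rates: 12 tickets/hour arriving, each technician
# resolves 2/hour, target 80% started within half an hour.
print(techs_needed(lam=12, mu=2, t=0.5))  # → 8
```

The clinic's multi-class model generalizes this by letting each technician serve only the request classes matching their skill set, which is why it falls back on simulation rather than a closed-form formula.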

  28. Questions? Contact: John McDermon jmcdermo@lanl.gov 505-667-7315
