How To Benchmark Applications Development or Maintenance: Theory and Practice
David G. Rogers


How to benchmark Apps, in a nutshell

  • Sponsor the benchmark at a senior level

  • Understand the risks, costs and timescales

  • Be actively involved in the benchmark

    • Passive benchmarking (“Speak when you’re spoken to”) is bad for your health

  • Cater for the measurement problems unique to Applications

    • Watch the UKSMA website – launching initiative to solve major AM measurement problem

  • Plan round the comparison problems





David G. Rogers

Senior sponsorship

  • In-house –

    • IT director

  • Outsourced –

    • Customer’s IT director (not just contract manager)

    • Supplier’s relationship and delivery managers (not just contract manager)

  • Give the benchmark the level of management commitment warranted by the risks


The Risks

  • The result might ruin formerly win-win relationships, damage careers, cost many jobs

    • Relationship of the Applications service supplier with its customers is at risk, whether in-house or outsourced

    • What will senior management do if a benchmark result says: “Your AM costs 3 times the market average”? Or “ … 1/3 the market average”?

    • First, they will decide whether they believe it.

      • If they don’t, reputations are damaged

      • If they do …


The Risks

  • The result might be wrong

    • Mistakes abound

      • Not necessarily (but possibly) by the benchmarker

      • The quality of the benchmark is your responsibility. Don’t delegate all responsibility for quality to the benchmarker

      • Watch the detail

        • Check all data going into the process

        • Ensure all services and all costs are reported (this may seem obvious, but …)

        • Ensure in writing that you have the right to check for possible arithmetic errors by the benchmarker (they are only human)

      • Build in cross-checks where possible


The Costs

  • Major cash costs in benchmarking Applications:

    • Benchmarker’s fee

    • FP counting costs could easily be higher

  • Staffing

    • One senior manager (reporting to the Sponsor) with overall responsibility

    • Full-time benchmark manager

    • System experts when required



The Timescales

  • Only passive benchmarks keep to the benchmarker’s schedule

    • In a “Passive benchmark” you:

      • Do only what the benchmarker tells you

      • Supply only the information you are asked for

      • Sit back and wait for The Answer

    • Passive benchmarking is bad for your health!


Some other stuff you MUST get right

  • Objectives

    • Crucial but usually easy if outsourced

      • Primary reason for outsourcing: 48% say “Reduce cost”

        • (that explains a lot … imagine recruiting senior executives on the same principle …)

    • Crucial but slippery if in-house

  • Like-for-like comparisons

    • Very hard to achieve … you have to help the benchmarkers

  • Releases

    • Very hard to match output to input … don’t leave it all to the benchmarker


Application Maintenance

  • The key metric: £ / FP maintained

  • Commercially crucial measurement

    • (see Risks above!)


How do you obtain £ / FP?

  • £ : the price to you of running AM

  • FP : the size of the maintained portfolio

    How is FP obtained?

  • Count the FPs: ±7.5%, but usually much too expensive

  • “Fast counts” etc.: less accurate (±20% or more)

    • Too inaccurate if results are commercially important

  • Much used: BACKFIRING

    • Count Source Lines Of Code (SLOC), and “backfire” to FPs using average ratios
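The price and the backfired size combine into the key £/FP metric. A minimal sketch of that arithmetic, using entirely hypothetical portfolio figures and an assumed SLOC-per-FP ratio (real ratios vary widely by language and by source):

```python
# Sketch of the backfiring arithmetic described above.
# All figures are illustrative, not published benchmark data.

def backfire_to_fp(sloc: int, sloc_per_fp: float) -> float:
    """Estimate function points from a SLOC count via an average ratio."""
    return sloc / sloc_per_fp

def cost_per_fp(annual_am_cost: float, portfolio_fp: float) -> float:
    """The key AM metric: currency units per function point maintained."""
    return annual_am_cost / portfolio_fp

# Hypothetical portfolio: 2,000,000 SLOC at an assumed 105 SLOC/FP,
# maintained for £1,500,000 a year.
fp = backfire_to_fp(2_000_000, 105)           # ≈ 19,048 FP
print(round(cost_per_fp(1_500_000, fp), 2))   # ≈ 78.75 (£ per FP)
```

Everything hinges on the assumed SLOC-per-FP ratio, which is exactly where the accuracy question below bites.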


How accurate is backfiring?

  • In one recent benchmark, the benchmarker claimed ±10%

  • Most experts say ±100% to ±400%

  • The experts differ, but if the latter, backfiring represents a HUGE commercial risk …

  • … so it is financially important to find out
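Whichever range is right, an error in the assumed backfiring ratio passes straight through, factor for factor, into the reported £/FP. A hypothetical illustration (all numbers invented):

```python
# How a backfiring-ratio error propagates into the measured £/FP.
# Overstating SLOC-per-FP understates the FP count, which overstates
# £/FP by the same factor. Illustrative numbers throughout.

def measured_cost_per_fp(annual_cost: float, sloc: int,
                         assumed_sloc_per_fp: float) -> float:
    """£/FP as a benchmark would report it from backfired sizing."""
    estimated_fp = sloc / assumed_sloc_per_fp
    return annual_cost / estimated_fp

# Suppose the portfolio's real ratio is 100 SLOC/FP:
# £1m a year over 1m SLOC is truly £100/FP.
cost, sloc = 1_000_000, 1_000_000

for assumed in (100, 200, 500):   # accurate, 2x off, 5x off (+400%)
    print(assumed, measured_cost_per_fp(cost, sloc, assumed))
```

At the +400% end, a service genuinely at the market average would be reported at five times it, which is precisely the career-and-contract risk described under "The Risks" above.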


Initiative launched to accumulate proof

  • Nothing even remotely confidential: only size matters

    • No dates, times, costs, prices, regions

    • No names except the verifying UKSMA member

  • The result will bear the imprimatur of UKSMA, and will be in the public domain


The desired outcome

  • The usefulness of SLOC as a metric for use in benchmarks will be finally and permanently quantified



Primary contact for this UKSMA initiative: [email protected], +44 7812 189 672
Any questions, suggestions or offers of data?
David Rogers is an EDS employee, but in this initiative is acting solely on UKSMA’s behalf.
