Software Effort Estimation

Planning to Meet Schedule

Lewis Sykalski

5/01/2010



How Managers View Software Effort Estimation

  • Nebulous

  • Vague

  • No correlation to reality

  • Why bother



Effects of Bad/No Estimation

  • Underestimation can lead to Schedule/Cost Over-runs:

    • Diminished Confidence by upper management

    • Customer upset

    • Can affect schedules/cost downstream

  • Overestimation can lead to Schedule/Cost Under-runs:

    • Adversely affect positioning

    • De-motivator for employees

    • More profitable opportunities passed over



Effect of Estimation

  • Clearer expectation of level of effort

  • Allows SPM to better allocate resources

  • Helps SPM account for change

  • Shows upper-level management that manager has a plan



Abstract

Most managers today are either unaware of established software effort estimation methodologies or do not account for the pitfalls inherent in the method they use.

This paper attempts to address this by surveying several effort estimation approaches and gauging both the utility and the inherent pitfalls of each.

Additionally, this paper will present a refined method for software effort estimation based on expert judgment and describe the benefits of using it.



Current Methodologies

  • Expert Judgment

  • Consensus-based Estimation

  • Price-to-Win

  • Analogy Costing

  • Function Point Analysis

  • Algorithmic Models

    • COCOMO

    • SLIM

    • PriceS



Expert Judgment

  • Consulting one or more experts

  • Experts leverage their own past experiences or methods

  • Experts typically arrive at a task duration; sometimes they estimate size, which can be converted to effort with an assumed productivity

  • Sometimes multiple experts’ estimates are averaged to smooth out the results (see the sketch below)
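The two conversions above can be sketched minimally as follows; the productivity figure (LOC per hour) is an illustrative assumption, not a value from this presentation.

```python
# Minimal sketch of expert-judgment post-processing: converting a size
# estimate to effort with an assumed productivity, and averaging several
# experts' effort estimates. The productivity figure is illustrative.

def size_to_effort(sloc, loc_per_hour=10.0):
    """Convert an expert's size estimate (SLOC) to effort in hours."""
    return sloc / loc_per_hour

def averaged_estimate(expert_estimates):
    """Smooth out several experts' effort estimates by simple averaging."""
    return sum(expert_estimates) / len(expert_estimates)

print(size_to_effort(1200))                    # 120.0 hours
print(averaged_estimate([40, 20, 5, 20, 30]))  # 23.0 hours
```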



Expert Judgment Utility

Advantages:

  • Not much time/effort required

  • Not conceptually difficult

    Disadvantages:

  • Not repeatable or explicit

  • No consistent rationale for estimates

  • Prone to error and very subjective

  • If estimates do not match the size of the experts’ historical experiences, the estimate can be way off-base

  • Experts with right experience could be hard to find



Consensus-based Estimation

  • Logical extension of expert judgment

  • Multiple experts/developers seek to reach consensus on estimate

  • Wideband Delphi is the most popular:

    • Short discussion to define the tasks

    • Secret ballots

    • Any deviations must be resolved by discussion & revote

  • Planning Poker (Agile Method)

    • Each participant shows a card representing the task’s estimated duration

    • Discussion and re-voting follow until consensus is reached



Consensus-Based Estimation Utility

Advantages:

  • Same advantages as parent -- Expert Judgment

  • Experts have discussed and agreed on the estimate

  • Hidden tasks are often discovered

    Disadvantages:

  • Same disadvantages as parent – Expert Judgment

  • Amount of time required to reach consensus

  • “Anchoring”: when the process is loosened and someone influences the group’s predisposition (e.g. “It can’t be more than 20 days…”)



Price-to-Win

  • The estimate is set at:

    • whatever value is optimal for winning the contract, or

    • whatever funds or time the customer has available



Price-to-Win Utility

Advantages:

  • Win the contract

  • Effort might contract to fit the winning price?

  • Not a lot of time required

    Disadvantages:

  • Considered poor practice

  • Large discrepancies between anticipated and required effort might result in severe overruns

  • Quality of the product may suffer in trying to reach deadline

  • Profit loss?



Analogy Costing

  • Estimates effort by analogy to past project(s)

  • ∆Effort = the adjustment for differences from the past project(s) in terms of requirements, reuse opportunity, process, etc. (see the sketch below)
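A minimal sketch of the idea, assuming the delta is expressed as multiplicative adjustment factors applied to a completed analog project's actual effort; the factor names and values are illustrative assumptions.

```python
# Minimal sketch of analogy costing: start from a completed analog project's
# actual effort and apply delta adjustments. Factor names/values are
# illustrative assumptions, not figures from the presentation.

def analogy_estimate(analog_effort_hours, deltas):
    """deltas: multiplicative adjustments for differences in requirements,
    reuse opportunity, process, etc. (1.0 = no difference from the analog)."""
    effort = analog_effort_hours
    for factor in deltas.values():
        effort *= factor
    return effort

past_effort = 800.0  # actual effort of the analog project, in hours
deltas = {"extra requirements": 1.20, "more reuse": 0.85, "stricter process": 1.10}
print(round(analogy_estimate(past_effort, deltas), 1))  # ~897.6 hours
```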



Analogy Costing Utility

Advantages:

  • Grounded in past history

  • The full effort need not be estimated, only the delta

  • Not a lot of time required (just meaningful analysis of deltas)

    Disadvantages:

  • Meaningful data may not be present

  • Subjectivity in deltas

  • Past historical data may not be representative (innovation efforts)

  • Abnormal conditions in past projects (that estimator is unaware of) may throw off results



Function Point Analysis (Albrecht)

  • Function point represents a piece of functionality of the program:

    • User-input

    • User-output

    • Inquiry (interactive inputs requiring response)

    • External (files shared or used externally)

    • Internal (files shared or used internally)



Function Point Analysis (Albrecht)

FP = Σi Σj (Nij × Wij)

where,

  • i = type of FP (user input, output, etc.)

  • j = complexity level of the FP (1-3)

  • Nij is the number of FPs of type i at complexity level j

  • Wij is the weight of an FP of type i at complexity level j

    LOC is then derived from FP via a curve fit whose parameters a & b come from historical data/curve-fitting

    Effort = LOC × Productivity
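A minimal sketch of the counting and conversion described above, using the textbook Albrecht weights; the FP-to-LOC fit (a, b) and the productivity figure are illustrative assumptions.

```python
# Minimal sketch of Albrecht-style function point counting. The weight table
# uses the textbook Albrecht weights; the FP-to-LOC fit (a, b) and the
# productivity figure are illustrative assumptions.

# WEIGHTS[i][j]: weight Wij for FP type i at complexity j (1=simple, 2=average, 3=complex)
WEIGHTS = {
    "user_input":  {1: 3, 2: 4, 3: 6},
    "user_output": {1: 4, 2: 5, 3: 7},
    "inquiry":     {1: 3, 2: 4, 3: 6},
    "external":    {1: 5, 2: 7, 3: 10},
    "internal":    {1: 7, 2: 10, 3: 15},
}

def function_points(counts):
    """counts[i][j] = Nij, the number of FPs of type i at complexity j."""
    return sum(n * WEIGHTS[ftype][level]
               for ftype, by_level in counts.items()
               for level, n in by_level.items())

def effort_hours(counts, a=60.0, b=1.0, hours_per_loc=0.05):
    """FP -> LOC via a curve fit (a, b from historical data), then Effort = LOC * Productivity."""
    fp = function_points(counts)
    loc = a * fp ** b
    return loc * hours_per_loc  # productivity expressed as hours per LOC

counts = {"user_input": {1: 4, 2: 2}, "user_output": {2: 3}, "inquiry": {1: 2}}
print(function_points(counts), round(effort_hours(counts), 1))  # 41 FPs, 123.0 hours
```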



Function Point Analysis Utility

Advantages:

  • Can be formulated at requirement time very early in the software-lifecycle

  • Provides good traceability in mapping tasks to effort

  • No subjectivity is required as to the task duration

  • Less prone to subjective bias than expert judgment based techniques

    Disadvantages:

  • Detailed requirements & consensus on complexity is required

  • Very nebulous

  • Time involved to arrive at such an estimate

  • Requires a good handle up-front of the requirements (prone to requirements creep/hidden requirements)



Algorithmic Models

Effort = F(C1, C2, …, Cn)

Where,

  • {C1, C2, …, Cn} denote cost factors

  • F represents the function used



COCOMO (Boehm)

  • Regression Formula

  • Historical Data Inputs

  • Current Project Characteristics (cost drivers, nominal rating = 1.0):

    • Product: Reliability, Documentation Needs, etc.

    • Computer: Performance Constraints, Volatility, etc

    • Personnel: Capability, Familiarity w/language, etc

    • Project: Co-location, Tools productivity gains, etc

  • Project Categorization: (Different historical data)

    • Organic: Small team / flexible process

    • Embedded: Large team / tight process

    • Semi-Detached: Somewhere in-between
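A minimal sketch of the regression formula, assuming the textbook COCOMO 81 coefficients for the three project categories; the cost-driver ratings in the example are illustrative (1.0 = nominal).

```python
# Minimal sketch of the COCOMO regression formula. The (a, b) pairs are the
# textbook Basic COCOMO 81 coefficients; the cost-driver ratings passed in
# are illustrative assumptions (1.0 = nominal).

MODES = {
    "organic":       (2.4, 1.05),  # small team / flexible process
    "semi-detached": (3.0, 1.12),  # somewhere in-between
    "embedded":      (3.6, 1.20),  # large team / tight process
}

def cocomo_effort(kloc, mode="organic", cost_drivers=()):
    """Effort in person-months: a * KLOC**b, scaled by the product of the
    product/computer/personnel/project cost-driver ratings."""
    a, b = MODES[mode]
    eaf = 1.0
    for rating in cost_drivers:  # e.g. high reliability = 1.15, good tools = 0.91
        eaf *= rating
    return a * kloc ** b * eaf

# 32 KLOC embedded project with two non-nominal cost drivers
print(round(cocomo_effort(32, "embedded", [1.15, 0.91]), 1))  # ~241 person-months
```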



SLIM (Putnam)

  • Putnam spent much time doing curve fitting and came up with the following equation:

    Effort = (Size / (Productivity × Time^(4/3)))^3 × B

    where,

  • Size is ESLOC or Effective SLOC (new+modified)

  • B is a scaling factor indicative of project size

  • Productivity is the Process Productivity factor

  • Time is duration of project schedule (years)
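A minimal sketch of the equation in the form the variables above suggest (the standard Putnam form); the productivity and B values in the example are illustrative assumptions, not SLIM-calibrated figures.

```python
# Minimal sketch of the Putnam/SLIM software equation rearranged for effort.
# The process-productivity and B values below are illustrative assumptions,
# not SLIM-calibrated figures.

def slim_effort(esloc, productivity, time_years, b=0.39):
    """Effort = (Size / (Productivity * Time^(4/3)))^3 * B, in person-years."""
    return (esloc / (productivity * time_years ** (4.0 / 3.0))) ** 3 * b

# 50,000 ESLOC, process productivity 10,000, 2-year schedule
print(round(slim_effort(50_000, 10_000, 2.0), 2))  # ~3.05 person-years
```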



PriceS

  • Parametric cost-estimation system

  • Can accept a wide variety of inputs:

    • Use-Cases, Function Points, Predictive Object Points, Functional Size, SLOC, and Fast Function Points.

  • Effort estimates also factor in:

    • Software Lifecycle processes (Waterfall, Agile, etc.)

    • Software Language Choice (Java, C++, etc).



Algorithmic Models Utility

Advantages:

  • Objective

  • Repeatable results

  • Historical data often represents MANY past projects

    Disadvantages:

  • Complex formulas are hard to comprehend

  • Requires faith on the users’ end

  • Subjective historical data

  • Subjective Cost factors may throw off equations

  • May require difficult/time consuming tailoring using estimator’s historical data



New Model: Predictive Expert Judgment

Combines the inherent simplicity of the expert judgment method with the feedback control provided for in other models

Requires:

  • Diligent tracking of actual times for past tasks

  • Records of experts’ estimates toward those tasks.



Predictive Expert Judgment (cont.)

Steps:

  • Solicit effort estimates for each task from each expert

  • As tasks are completed, build a repository tracking how close each expert’s estimate was to the actual effort

  • Weight each expert’s future estimates by how well he has historically estimated (a minimal tracking sketch follows)
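A minimal sketch of the tracking repository these steps call for: record each expert's estimate against the actual effort when a task completes, then summarize how far (and how consistently) the expert tends to be off. Function and variable names are illustrative.

```python
# Minimal sketch of the estimate-tracking repository: store each expert's
# (estimate, actual) pairs and summarize their historical estimating error.
from collections import defaultdict
from statistics import mean, pstdev

history = defaultdict(list)  # expert name -> list of (estimate, actual) pairs

def record(expert, estimate, actual):
    """Call this whenever a task completes."""
    history[expert].append((estimate, actual))

def error_stdev(expert):
    """Spread of the expert's errors (estimate - actual): used for trust weights."""
    return pstdev(est - act for est, act in history[expert])

def error_bias(expert):
    """Average signed error: how far, and in which direction, the expert is normally off."""
    return mean(est - act for est, act in history[expert])

record("A", 35, 30); record("A", 50, 44); record("A", 12, 10)
print(error_stdev("A"), error_bias("A"))
```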



Predictive Expert Judgment Equation

Estimate = Σi (Wi × Ei)

Where,

  • Wi corresponds to each expert’s trust weight based on historical performance

  • Ei corresponds to each expert’s estimate for the current task being estimated
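A minimal sketch of the equation, using the rounded trust weights from the worked example later in the presentation.

```python
# Minimal sketch of the predictive expert judgment combination:
# Estimate = sum over experts of (trust weight Wi) * (estimate Ei).
def predictive_estimate(weights, estimates):
    return sum(w * e for w, e in zip(weights, estimates))

# Rounded trust weights and estimates from the worked example below
print(predictive_estimate([0.35, 0.16, 0.11, 0.22, 0.17], [40, 20, 5, 20, 30]))  # ~27 hours
```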



Predictive Expert Judgment (cont.)

  • Wi can be calculated:

    • Simple formula involving standard deviations

    • An intermediate custom formula where some experts are disregarded if they fall outside a target range or based on external factors

    • Weighted Variance Equation



Simple Formula

Where,

  • Wi corresponds to each expert’s trust weight based on historical performance

  • N is the number of experts with historical data

  • σi is the current expert’s historical stdev

  • σn is each expert’s historical stdev (one plausible form of the weighting is sketched below)
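The equation image is not reproduced in the transcript; a minimal sketch, assuming the simple formula normalizes inverse historical standard deviations (so experts with a tighter historical spread get more weight), is:

```python
# Minimal sketch of one plausible "simple formula": weight each expert by the
# inverse of his historical stdev, normalized over the N experts with history.
# This exact form is an assumption; the stdev values are hypothetical.
def trust_weights(stdevs):
    """stdevs[i] = sigma_i, expert i's historical stdev; returns weights summing to 1."""
    inverses = [1.0 / s for s in stdevs]
    total = sum(inverses)
    return [inv / total for inv in inverses]

# Hypothetical stdevs for experts A..E; these reproduce the rounded weights
# used in the examples that follow (0.35, 0.16, 0.11, 0.22, 0.17).
print([round(w, 2) for w in trust_weights([5.7, 12.5, 18.2, 9.1, 11.8])])
```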



Expert Judgment Example

Expert A: 40 hours

Expert B: 20 hours

Expert C: 5 hours

Expert D: 20 hours

Expert E: 30 hours

No historical data:

40*0.2 + 20*0.2 + 5*0.2 + 20*0.2 + 30*0.2 = 23.0 hours



Predictive Expert Judgment Example

Everybody-counts methodology (every expert’s estimate is included, weighted by historical trust)…

0.35*40 + 0.16*20 + 0.11*5 + 0.22*20 + 0.17*30 ≈ 27.0 hours (weights shown rounded)



Predictive Expert Judgment – W/Constraints

If we had a rule that threw out an expert’s estimate when his historical standard deviation was wildly off (> 12.0 stdev):

0.47*40 + 0.29*20 + 0.23*30 ≈ 31.8 hours (weights shown rounded)

(where σi < 12.0)

(Could be closer?)
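A minimal sketch of the thresholding rule, reusing the hypothetical stdevs from the earlier sketch (the σ values themselves are assumptions; only the estimates and the 12.0 threshold come from the slides):

```python
# Minimal sketch of the constraint rule: drop experts whose historical stdev
# is over the threshold, recompute trust weights over the rest, and combine.
# The stdev values are the same hypothetical ones as in the earlier sketch.
def constrained_estimate(estimates, stdevs, max_stdev=12.0):
    kept = [(e, s) for e, s in zip(estimates, stdevs) if s < max_stdev]
    inverses = [1.0 / s for _, s in kept]
    total = sum(inverses)
    return sum((inv / total) * e for (e, _), inv in zip(kept, inverses))

estimates = [40, 20, 5, 20, 30]           # experts A..E
stdevs    = [5.7, 12.5, 18.2, 9.1, 11.8]  # hypothetical historical stdevs
print(round(constrained_estimate(estimates, stdevs), 1))  # ~31.8 hours
```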



Predictive Expert Judgment – W/Constraints (cont.)

You could alternatively tighten the standard deviation constraint to trust only the leading expert…

1.0 * 40 = 40.0 hours

(where σi is the best, i.e., only the expert with the lowest historical stdev is trusted)



Predictive Expert Judgment – W/Constraints (cont.)

You could also adjust for deviations in estimates (how far each expert is normally off, and in which direction):

0.35*38.75 + 0.16*20 + 0.11*3.75 + 0.22*13.75 + 0.17*26.25 ≈ 24.4 hours (weights shown rounded)
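A minimal sketch of the bias adjustment: shift each expert's estimate by his average signed error before applying the trust weights. The bias figures are read off the adjusted estimates above; the weights reuse the earlier hypothetical stdevs, carried to three decimals.

```python
# Minimal sketch of the deviation adjustment: remove each expert's habitual
# over/under-estimation, then apply the trust weights. Biases are inferred
# from the adjusted estimates above; weights come from the earlier sketch.
def bias_adjusted_estimate(estimates, biases, weights):
    adjusted = [e - b for e, b in zip(estimates, biases)]
    return sum(w * e for w, e in zip(weights, adjusted))

estimates = [40, 20, 5, 20, 30]
biases    = [1.25, 0.0, 1.25, 6.25, 3.75]        # average hours each expert over-estimates
weights   = [0.347, 0.158, 0.109, 0.218, 0.168]  # trust weights to three decimals
print(round(bias_adjusted_estimate(estimates, biases, weights), 1))  # ~24.4 hours
```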



Results & Analysis

  • No model can estimate the cost of software with a high degree of accuracy

  • Expert Judgment was found to be just as good as algorithmic models (in 15 case studies)

  • Uncertainty and Probability should be added to most models

  • More historical data needs to be collected from industry



Conclusion

  • A software manager who takes time to perform and reconcile software effort estimation will be on safer footing than a manager who doesn’t

  • Use the advantages/disadvantages outlined in this paper and pick the method you feel most comfortable with

