Presentation Transcript

A Neuro-Fuzzy Model with SEER-SEM for Software Effort Estimation
Wei Lin Du, Danny Ho*, Luiz F. Capretz
Software Engineering, University of Western Ontario, London, Ontario, Canada
* NFA Estimation Inc., Richmond Hill, Ontario, Canada
November 2010

Agenda
  • Purpose
  • SEER-SEM
  • NF SEER-SEM
  • Evaluation
  • Conclusion
Purpose
  • Integrate neuro-fuzzy (NF) technique with SEER-SEM
  • Evaluate estimation performance of NF SEER-SEM versus SEER-SEM
Agenda
  • Purpose
  • SEER-SEM
  • NF SEER-SEM
  • Evaluation
  • Conclusion
SEER-SEM
  • SEER-SEM was trademarked by Galorath Associates, Inc. (GAI) in 1990
  • Effort estimation is one of the SEER-SEM algorithmic models

[Diagram: SEER-SEM estimation processing. Inputs: Size, Personnel, Environment, Complexity, Constraints. Outputs: Effort, Cost, Schedule, Risk, Maintenance.]

SEER-SEM Effort Estimation
  • Software Size
    • Lines, function points, objects, use cases
  • Technology and Environment Parameters
    • Personnel capabilities and experience (7)
    • Development support environment (9)
    • Product development requirements (5)
    • Product reusability requirements (2)
    • Development environment complexity (4)
    • Target environment (7)
SEER-SEM Equations
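The effort equations themselves were shown as an image on the original slide and did not survive in the transcript. The following is a reconstruction of the commonly published SEER-SEM form, included only as a reading aid (an assumption, not taken from this deck; Cte is obtained by adjusting Ctb with the technology and environment parameter ratings):

$$K = D^{0.4}\left(\frac{S_e}{C_{te}}\right)^{1.2}, \qquad E \approx 0.3945\,K$$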

where:
  • E is the development effort
  • K is the total lifecycle effort, including development and maintenance
  • Se is the effective size
  • D is the staffing complexity
  • Cte is the effective technology
  • Ctb is the basic technology

Agenda
  • Purpose
  • SEER-SEM
  • NF SEER-SEM
  • Evaluation
  • Conclusion

[Diagram: the Neuro-Fuzzy Algorithm (NFA) framework, USA Patent No. US-7328202-B2. Factor ratings RF1…RFN are adjusted by a Preprocessing Neuro-Fuzzy Inference System (PNFIS) into ARF1…ARFN; each adjusted rating passes through its neuro-fuzzy bank NFB1…NFBN to produce the numerical factors/multipliers FM1…FMN, which, together with the other inputs V, feed the Algorithmic Model and yield the output metric Mo.]

where:
  • N is the number of contributing factors
  • M is the number of other variables in the Algorithmic Model
  • RF is the factor rating
  • ARF is the adjusted factor rating
  • NFB is the neuro-fuzzy bank
  • FM is the numerical factor/multiplier input to the Algorithmic Model
  • V is an input to the Algorithmic Model
  • Mo is the output metric
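As a reading aid only, here is a minimal Python sketch of the data flow in the diagram above; the names nfa_estimate, preprocess, nfbs, and algorithmic_model are hypothetical and are not taken from the patent or the deck.

```python
from typing import Callable, Sequence

def nfa_estimate(rf: Sequence[float],                       # factor ratings RF1..RFN
                 nfbs: Sequence[Callable[[float], float]],  # neuro-fuzzy banks NFB1..NFBN
                 v: Sequence[float],                        # other model inputs V1..VM
                 algorithmic_model: Callable[..., float],
                 preprocess: Callable[[Sequence[float]], Sequence[float]]) -> float:
    """Data flow of the NFA framework: RF -> PNFIS -> ARF -> NFB -> FM -> Algorithmic Model -> Mo."""
    arf = preprocess(rf)                        # PNFIS adjusts the raw factor ratings
    fm = [nfb(a) for nfb, a in zip(nfbs, arf)]  # each NFBi maps ARFi to a numerical factor FMi
    return algorithmic_model(fm, v)             # the algorithmic model produces the output metric Mo
```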


[Diagram: internal structure of a neuro-fuzzy bank (NFB), a five-layer network (Layer 1 to Layer 5). The adjusted factor rating ARFi is fuzzified by the fuzzy sets Ai1…AiN (Layer 1), producing firing strengths w1…wN (Layer 2), which are normalized (Layer 3), weighted by the per-level parameters FMPi1…FMPiN (Layer 4), and summed into the numerical factor FMi (Layer 5).]

where:
  • ARFi is the adjusted factor rating for contributing factor i
  • Aik is the fuzzy set for the k-th rating level of contributing factor i
  • wk is the firing strength of fuzzy rule k
  • w̄k is the normalized firing strength of fuzzy rule k
  • FMPik is the parameter value for the k-th rating level of contributing factor i
  • FMi is the numerical value for contributing factor i
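A minimal sketch of the five-layer computation inside one NFB, assuming triangular membership functions and a zero-order Sugeno-style combination (both are assumptions; the deck does not specify the membership-function shape, and the function name nfb_output is illustrative):

```python
import numpy as np

def nfb_output(arf_i: float,
               centers: np.ndarray,  # assumed membership-function centers, one per rating level k
               width: float,         # assumed half-width of each triangular fuzzy set Aik
               fmp_i: np.ndarray     # parameter values FMPik, one per rating level k
               ) -> float:
    """Five-layer NFB computation: fuzzify ARFi, derive firing strengths wk,
    normalize them, and combine the per-level parameters FMPik into FMi."""
    # Layers 1-2: membership of ARFi in each fuzzy set Aik gives the firing strengths wk
    w = np.maximum(0.0, 1.0 - np.abs(arf_i - centers) / width)
    total = w.sum()
    if total == 0.0:
        # ARFi falls outside all fuzzy sets; fall back to the nearest rating level
        return float(fmp_i[np.argmin(np.abs(arf_i - centers))])
    # Layer 3: normalized firing strengths
    w_bar = w / total
    # Layers 4-5: weight each level's parameter value and sum to obtain FMi
    return float(np.sum(w_bar * fmp_i))
```

For example, with rating levels centered at 1 through 5 and width 1.0, nfb_output(2.4, np.arange(1.0, 6.0), 1.0, fmp_i) interpolates between the parameter values assigned to rating levels 2 and 3.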

NF SEER-SEM

[Diagram: NF SEER-SEM architecture. Size and SIBR feed the SEER-SEM effort estimation algorithmic model directly, while each technology and environment parameter, from P1 (ACAP) and P2 (AEXP) through P34 (staffing complexity), first passes through its own neuro-fuzzy sub-model (NF1, NF2, …, NFm) before entering the model.]

Agenda
  • Purpose
  • SEER-SEM
  • NF SEER-SEM
  • Evaluation
  • Conclusion
Performance Metrics
  • Relative Error (RE) = (Est. Effort – Act. Effort) / Act. Effort
  • Magnitude of Relative Error (MRE) = |Est. Effort – Act. Effort| / Act. Effort
  • Mean Magnitude of Relative Error (MMRE) = (∑ MRE) / n
  • Prediction Level: PRED(L) = k / n, where k is the number of projects with MRE ≤ L and n is the total number of projects
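A minimal sketch of these metrics in code (the function names are illustrative, not from the deck):

```python
import numpy as np

def mmre(actual: np.ndarray, estimated: np.ndarray) -> float:
    """Mean Magnitude of Relative Error over n projects."""
    mre = np.abs(estimated - actual) / actual
    return float(mre.mean())

def pred(actual: np.ndarray, estimated: np.ndarray, level: float) -> float:
    """PRED(L): fraction of projects whose MRE is at most L (e.g., level=0.25 for PRED(25%))."""
    mre = np.abs(estimated - actual) / actual
    return float(np.mean(mre <= level))
```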

MMRE Results

A negative MMRE change indicates improvement (lower MMRE is better)

PRED Results

A positive PRED change indicates improvement (higher PRED is better)

Summary of Evaluation Results
  • MMRE improves in all cases, with the greatest improvement exceeding 25%
  • Average PRED(100%) is increased by 12%
  • NF SEER-SEM improves MMRE by reducing large MREs
Agenda
  • Purpose
  • SEER-SEM
  • NF SEER-SEM
  • Evaluation
  • Conclusion
Conclusion
  • NF with SEER-SEM improves estimation accuracy
  • General soft computing framework works with various effort estimation algorithmic models
Future Directions
  • Evaluate with original SEER-SEM dataset
  • Evaluate general soft computing framework with:
    • more complex algorithmic models
    • other domains of estimation