slide1

Software Engineering

Management

Course # CISH-6050

Lecture 4:

Software Process & Project Management Part 1

Convener:

Houman Younessi

06/04/2012

1

slide2

AGENDA

  • SW-CMM Level 2: Software Process & Project Management
    • Requirements Management
    • Project Tracking & Oversight
      • Risk Management
    • Project Planning
    • SQA
    • Software Configuration Management
    • Sub-contract Management

2

slide3

Software Project Management

  • Software Project Management includes:
    • Carrying out the definition of a job to be completed
    • Completing a plan to get a job done
  • Foundation:
    • Commitments are made to get the job done
    • Plans, estimates, reviews & tracking systems support those commitments

3

slide4

Software Project Management …

  • Balance between getting product “out the door” & maintaining the organization’s long-term capability
  • Need Sr. Management commitment to ensure that a proper project management system is in place and followed

4

slide5

Software Project Management …

  • Questions to be answered to develop effective software development plan:
    • Why is system being developed?
    • What will be done, by when?
    • Who is responsible for each function?
    • Where are they organizationally located?
    • How will the job be done technically & managerially?
    • How much resource is needed?

5

slide6

Software Project Management …

  • Effective Project Management focuses on the 4 P’s:
    • People – Motivated, highly skilled
    • Product – Objectives & scope
    • Process – Framework for development
    • Project – Planned and controlled

6

slide7

Requirements Management

  • As project progresses and customer looks closer at the problem/solution, they generally identify “changes”
  • Requirements must be managed in order to preserve project plan, schedule, milestones, etc.
  • Manage Scope Creep

7

slide8

Requirements Management …

  • Traceability tables can be created to help manage requirements:
    • Features Traceability
    • Source Traceability
    • Subsystem Traceability
    • Interface Traceability

8

slide9

Requirements Management …

  • Requirements Engineering:
    • Requirements Definition – natural language statement of the services system will provide
    • Requirements Specifications – Structured document identifying system services
    • Software Specifications – Abstract definition of software; basis for design & implementation

9

slide10

Requirements Management …

  • Requirements Engineering Stages:
    • Feasibility Study
    • Requirements Analysis
    • Requirements Definition
    • Requirements Specification

10

slide11

Requirements Management …

  • Requirements Document:
    • Combination of requirements definition and requirements specifications
    • NOT a design document – What, not How
    • Addresses:
      • External system behavior
      • Implementation constraints
      • Easy to change
      • Serves as a reference tool for system maintainers

11

slide12

Requirements Management …

  • Requirements Validation:
    • Verify requirements do what customer wants system to do
    • Validity: further analysis of requirements might identify additional function
    • Consistency: Ensure requirements don’t conflict with one another
    • Completeness: Satisfies customer needs
    • Realism: Make sure requirements can be realized

12

slide13

Requirements Management …

  • Requirements Evolution:
    • Refining requirements = better understanding of user’s needs
    • Process feeds information back to user, which can cause requirements to change
    • Evolving Requirements:
      • Enduring – Stable
      • Volatile – Likely to change

13

slide14

Requirements Management …

Some observed Requirements Pitfalls to avoid

14

slide15

Avoiding Requirement Pitfalls

  • Reference Point:
    • Steve McConnell, Rapid Development: Taming Wild Software Schedules, Microsoft Press, Redmond WA, 1996
    • Chapter 14: Feature Set Control
    • Utilized most approaches suggested for controlling function – some worked, some didn’t
    • Most serious problem: scope creep

15

slide16

Avoiding Requirement Pitfalls …

  • Minimum Specification:
    • On larger projects, Analysts (unintentionally) used vague statements or left specific requirement details for programmer’s interpretation
    • In the correct situation, minimum specifications can help, but not in this case
    • Clearly identify function being requested
    • Never assume developer has same level of application knowledge as analyst

16

slide17

Avoiding Requirement Pitfalls …

  • Requirements Scrubbing:
    • Usually more function requested than development schedule will allow
    • Requirements are scrubbed to eliminate function or offer alternative (cheaper) solutions
    • Approach works to maintain schedule, but doesn’t satisfy customer
    • Avoid scrubbing requirements to extent they don’t meet customer needs

17

slide18

Avoiding Requirement Pitfalls …

  • Versioned Development:
    • Use this approach when customer won’t allow function to be eliminated, but all function can’t be contained in given schedule
    • Pursue a phased/staged approach for delivering function
    • Caution – Ensure the additional phases are scheduled and implemented!

18

slide19

Avoiding Requirement Pitfalls …

  • Feature Creep Control:
    • Attempt to strictly enforce change management during development
    • We’ve done this well on some projects and not so well on other projects
    • Customer identifies change late in the cycle that must be done or else they won’t accept product!
    • Beware of design changes or new features being labeled as defects just so they get implemented

19

slide21

Project Tracking & Oversight

  • Requirement for sound project management is ability to determine project status
  • Planning process includes schedule with checkpoints (milestones)
  • Tools for creating project schedules
    • Microsoft Project
    • ABT Project Workbench
    • Spreadsheets

21

slide22

Project Tracking & Oversight …

  • Project Schedule provides roadmap for project mgr to manage project
  • Project Schedule defines tasks and milestones to be properly tracked and controlled
  • Tracking done through:
    • Status mtgs, evaluating results of reviews, tracking project milestones, comparing actual dates against plan dates, verifying time spent on tasks, etc.

22

slide23

Project Tracking & Oversight …

  • Project Plan:
    • Provides baseline cost and schedule
    • Brief document addressed to diverse audience
    • Not static – updating risks, estimates, schedules, etc.
    • Communicates scope & resource
    • Defines risks and risk mgmt techniques
    • Outlines how quality ensured and change managed

23

slide24

Project Tracking & Oversight …

  • Risk Management:
    • Reactive vs. proactive risk strategies & management
      • Crisis management & fire fighting will jeopardize project
      • Risk needs to be proactively managed throughout life of project

24

slide25

Project Tracking & Oversight …

  • Types of Risks:
    • Project Risks – threaten project plan
    • Technical Risks – threaten timeliness & quality; can it be implemented?
    • Business Risks – threaten validity of software being built; may jeopardize project

25

slide26

Project Tracking & Oversight …

  • Types of Business Risks:
    • Building excellent product no one wants
    • Building product that no longer fits into business strategy
    • Building a product that the sales force doesn’t understand how to sell
    • Losing support of senior management
    • Losing budget or personnel commitment

26

slide27

Project Estimation

  • Two aspects of Project Estimation
    • Effort
    • Schedule
  • Software Estimation needed to determine:
    • How big the project is (effort)
    • How much it will cost to complete
    • How long it will take to complete (schedule/duration)

27

slide28

Project Estimation …

  • Are highly precise estimates really needed, vs. reasonable estimates?
  • Estimates become self-fulfilling prophecy
    • Schedules derived from estimates
  • Can’t precisely determine if estimates were correct
    • “Work expands to fill available time”

28

slide29

Project Estimation …

  • Facts about estimating, from R. L. Glass, Facts and Fallacies of Software Engineering:
    • Poor estimation is one of the two most common causes of runaway projects
      • Estimates are really wishes vs. realistic targets
      • Shortcuts taken to make targets
      • Problems with estimation techniques: experts, algorithmic approaches, LOC, FP

29

slide30

Project Estimation …

  • Software estimation usually occurs at the wrong time:
      • Software estimates usually done at the very beginning of a project
      • To make meaningful estimate, need to know a lot about the project
      • First phase of project is requirements gathering, so total facts about project not known yet
      • Estimating solution time & cost while total problem isn’t understood

30

slide31

Project Estimation …

  • Software estimation is usually done by the wrong people:
      • Software estimates should be done by folks who build the software – programmers, project managers, etc.
      • Corporate “politics” – estimation done by senior management, marketing organization, customers, and users
      • Wishes vs. reality

31

slide32

Project Estimation …

  • Software estimates are rarely corrected as the project proceeds:
      • As the project proceeds and more information is known about the project, estimates aren’t adjusted
      • Developers pursue achieving the original estimates; upper mgmt not interested in revising estimates
      • Project results usually measured against the first estimates

32

slide33

Project Estimation …

  • Software estimates are faulty, but everyone is concerned when they are not met:
      • Given how inaccurate estimates can be and not adjusted during project, should estimates be treated as relatively unimportant?
      • Instead, software projects are always managed by schedule
      • Other ways to manage project success or failure, rather than just by schedule

33

slide34

Project Estimation …

  • Disconnect between management and their programmers:
      • Research study: project failed to meet estimates – management felt project was failure; developers thought it was a success
      • 419% over budget; 193% over schedule; 130% over original size estimate
      • Project was completed; did what it was supposed to; no post-release defects

34

slide35

Project Estimation …

  • The answer to a feasibility study is always ‘Yes’:
      • No new problem is too tough to solve
      • Optimism – believe we can produce error free code very quickly
      • Reality – error-removal phase takes longer than analysis, design, and code
      • When feasibility study done, often proceed with project because we feel it can be done; find out too late that it couldn’t be done

35

slide36

Project Estimation …

  • Fallacy: To estimate cost & schedule, first estimate LOC:
      • Evolved over the years - notion of using LOC to estimate size
      • LOC then converted to cost & schedule
      • Fallacies with most popular method?
        • COBOL LOC = to C++ LOC?
        • Mathematic vs. business LOC?
        • Junior programmer vs. experienced

36

slide37

Project Estimation …

  • Some other thoughts on Software Estimation:
    • All system attributes affect one another
    • Reach goals in one area by sacrificing others
    • Design to cost vs. attempting to cost a design
    • Understand quality requirements to estimate cost
    • Past project data useful; current project data better

37

slide38

Cost Estimation

  • Four Techniques for estimating effort and schedule:
    • Expert Opinion
    • Analogy
    • Decomposition
    • Models

38

slide39

Cost Estimation …

  • Expert Opinion:
    • Utilizes mature developer’s experiences
    • Parameters of project described and experts make predictions based on past experiences
    • Expert may use tools, models, or other methods to generate estimates
    • Strength of estimates relies on expert and their breadth of experience

39

slide40

Cost Estimation …

  • Analogy:
    • Formal, more visible approach to expert opinion
    • Compare proposed project with one or more past projects
    • Similarities & differences in projects identified; differences used to adjust effort
    • Estimators describe project in terms of key characteristics

40

slide41

Cost Estimation …

  • Decomposition:
    • Thorough analysis of project characteristics that affect cost
    • Focus on products being delivered or tasks required to build software
    • Described in smallest components / tasks, which are estimated
    • For project estimate, low-level estimates are summed or used with compositional rules for complexity

41

slide42

Cost Estimation …

  • Models:
    • Uses techniques that identify key contributors to effort, generating mathematical formulas
    • In addition to size, may include experience of team, language, degree of code reuse, etc.
    • Models usually based on past experience and may require some decomposition

42

slide43

Cost Estimation Models …

  • Models:
    • Most organizations prefer Models or decomposition vs. expert opinion or analogy
    • Two types of models to estimate effort:
      • Cost Models – provide direct estimates of effort or duration. Ex: COCOMO
      • Constraint Model – relationship over time between 2 or more parameters of effort, duration or staffing. Ex: Putnam

43

slide44

Cost Estimation …

  • Each of the 4 techniques can be applied in one of two ways:
    • Bottom-up estimation
      • Estimates done at the lowest-level parts or tasks
      • Similar to decomposition, but applies to analogy, expert opinion, & models
    • Top-down estimation
      • Full estimate made for overall process or product
      • Estimates for components calculated

44
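As a concrete illustration of bottom-up estimation, low-level task estimates are simply summed into the project total (all task names and figures below are hypothetical):

```python
# Bottom-up estimation sketch: estimate the lowest-level tasks
# individually, then sum them for the project total.
# All figures are hypothetical, in person-days.
tasks = {
    "requirements": 12,
    "design": 20,
    "coding": 35,
    "testing": 25,
}

total_effort = sum(tasks.values())  # project estimate in person-days
```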

slide45

Software Measurement History

  • Groundwork for software measures and measurement, including estimation, was established in the 1960s and mainly in the 1970s
  • Work continues in this area
  • Most software specialists agree that higher reliability is achieved when software systems are highly modularized & structure kept simple

45

slide46

Software Measurement History …

  • LOC is the earliest software measure
    • Used in 1955 to analyze size of first FORTRAN compiler
  • SLOC (Source Lines of Code) in the 1960s were counted by the number of 80-column cards (physical lines of code)
  • McCabe’s Measure - 1970s: cyclomatic complexity, the number of linearly independent paths through a program’s flowgraph

46
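McCabe's measure can be computed directly from a program's flowgraph; a minimal sketch of the standard formula V(G) = E - N + 2P (the function name is illustrative):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P, where E is
    the number of flowgraph edges, N the number of nodes, and P the
    number of connected components (1 for a single program)."""
    return edges - nodes + 2 * components

# Illustrative: a flowgraph with 9 edges and 8 nodes
v_g = cyclomatic_complexity(9, 8)  # 3 independent paths
```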

slide47

Software Measurement History …

  • Halstead Measures – 1970’s: based on source code of program; effort can be expressed as function of operator count, operand count, or usage count
  • Ruston Measures – 1981: describes program flowchart by means of a polynomial; suitable for network measurement, not as popular as McCabe

47

slide48

Software Measurement History …

  • Estimation Models:
    • Delphi 1966
    • RCA Price-S System 1976
    • Putnam’s SLIM Model 1978
    • Function-Point Method 1979
    • COCOMO Model 1981
    • Bailey and Basili 1981
    • Mark II Function Points 1988
    • Pfleeger Model 1989
    • COCOMO 2.0 Model 1996

48

slide49

Software Estimation Models

  • Price To Win
    • Low Bid or First to Market
    • Under bid the competition to get contract
    • Figure out later (after have the contract) how to meet the cost, schedule, and effort
  • SPQR
    • Software Productivity, Quality, and Reliability Model
    • Produced by Capers Jones
    • Based on 45 factors influencing cost & productivity

49

slide50

Software Estimation Models: Delphi

  • Delphi
    • Based on iterative expert opinion
    • Requires domain and organizational expertise
    • Experts use analogies from past experiences
    • Improves with low level decomposition
    • May be used for new or unprecedented systems

50

slide51

Software Estimation Models: Delphi …

  • Steps in the Delphi method:
    • Experts given specs and forms
    • Meet to discuss product & issues
    • Complete estimation form anonymously
    • Coordinator tabulates estimates
    • Results returned to experts
    • Only personal estimate identified
    • Meet to discuss results & revise estimates
    • Repeat steps until convergence of estimates

51
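The iteration loop above can be sketched as a toy simulation; the pull-toward-the-median revision rule and all the numbers are assumptions for illustration, not part of the Delphi method itself:

```python
def delphi_round(estimates, pull=0.5):
    """One anonymous revision round: each expert moves a fraction
    `pull` of the way toward the group median (a toy model)."""
    median = sorted(estimates)[len(estimates) // 2]
    return [e + pull * (median - e) for e in estimates]

def delphi(estimates, spread=1.0):
    """Repeat rounds until the estimates converge within `spread`."""
    while max(estimates) - min(estimates) > spread:
        estimates = delphi_round(estimates)
    return estimates

# Four experts' initial effort estimates, in staff-weeks (hypothetical)
final = delphi([20.0, 35.0, 50.0, 80.0])
```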

slide52

Software Estimation Models: SLIM

  • Putnam SLIM:
    • Exponential analytic model
    • Size_LOC = Ck * K^(1/3) * td^(4/3)
    • Where
      • K = effort in staff years
      • td = development time in years
      • Ck = Technology factor; 4,000 - 6,000 in 1979; now Ck = 10,000 - 20,000

52
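Rearranging the size equation gives effort as a function of size and schedule: K = (Size / (Ck * td^(4/3)))^3. A minimal sketch, with an assumed illustrative technology factor:

```python
def slim_effort(size_loc, td_years, ck=10_000):
    """Effort K in staff-years from Putnam's size equation
    Size = Ck * K^(1/3) * td^(4/3), solved for K."""
    return (size_loc / (ck * td_years ** (4 / 3))) ** 3

# Illustrative: 70,000 LOC delivered in 2 years, Ck = 10,000
effort = slim_effort(70_000, 2.0)  # roughly 21 staff-years
```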

slide53

Software Estimation Models: SLIM …

  • Rationale behind Putnam:
    • Developed for US Army in 1978 to estimate large software projects (over 70,000 LOC)
    • Model assumes that effort for software development projects is distributed similarly to a collection of Rayleigh curves for major milestones:
      • Requirements, design/code, test & validation, maintenance

53

slide54

Software Estimation Models: FP

  • Function Point Method:
    • Developed by Allan Albrecht at IBM in 1979
    • Albrecht wanted to create a methodology to communicate to his users about their application
    • Measures software sized based on logical functionality of system as described by a specification
    • Can compute FPs from the user’s logical perspective, independent of technology of the physical system

54

slide55

Software Estimation Models: FP …

  • Function Point Method …
    • FP standards maintained by the International Function Point Users Group (IFPUG)
    • Approach is to count user functionality and adjust for system complexity
    • Useful because it’s based on early information
    • Takes into account Data Function Types and Transactional Function Types

55

slide56

Software Estimation Models: FP …

  • Function Point Method …
    • Controversy with Function Points Method
    • July, 1998 – Gartner Group declares that Function Points have replaced LOC as the standard unit of measure for reporting software size
    • Rationale: FPs quantify the business requirements that the software is intended to address
    • Other experts in the field discount the benefits of using FPs for measurement

56

slide57

Software Estimation Models: FP …

  • FP Method: Trans. Function Types
    • External Inputs – File types and data elements; user data or control input types
    • External Outputs – File types and data elements; user data or control output type like reports and messages
    • External Inquiries – File types and data elements; interactive inputs requiring a response (queries)

57

slide58

Software Estimation Models: FP …

  • FP Method: Data Function Types
    • Internal Logical Files – Record element and data element types; logical master files in system
    • External Interface Files – Record element and data element types; machine-readable interfaces to other systems

58

slide59

Software Estimation Models: FP …

  • Function Point Formula:

FP = UC * TCF

  • Where the Unadjusted Count is

UC = aI + bO + cE + dL + eF

  • and the Technical Complexity Factor is

TCF = 0.65 + 0.01 * ΣFi

59

slide60

Software Estimation Models: FP …

  • Unadjusted Count:

UC = aI + bO + cE + dL + eF

  • Where:
    • I = Number of Input Files
    • O = Number of Output Files
    • E = Number of Inquiry Types
    • L = Number of Logical Internal Files
    • F = Number of Interfaces
  • a, b, c, d, e are Weighting Factors …

60

slide61

Software Estimation Models: FP …

  • a, b, c, d, e > 0 are weighting factors determined as follows:

Type   Simple   Average   Complex
I      3        4         6
O      4        5         7
E      3        4         6
L      7        10        15
F      5        7         10

  • Average Unadjusted Count is

UC = 4I + 5O + 4E + 10L + 7F

61
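The average-weight form of the count can be sketched as follows (the function name and the defaults tuple are illustrative):

```python
def unadjusted_count(i, o, e, l, f, weights=(4, 5, 4, 10, 7)):
    """Unadjusted function point count UC = aI + bO + cE + dL + eF;
    the defaults are the average-complexity weights."""
    a, b, c, d, e_w = weights
    return a * i + b * o + c * e + d * l + e_w * f

# Illustrative: 5 inputs, 2 outputs, 20 inquiries, 4 logical files,
# 3 interfaces, all assumed average complexity
uc = unadjusted_count(5, 2, 20, 4, 3)
```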

slide62

Software Estimation Models: FP …

  • Technical Complexity Factor:
    • Each Fi varies from 0 to 5, based on 14 factors
    • Maximum influence of the 14 factors is

0.65 + 0.01 * 14 * 5 = 1.35

    • The value of the Unadjusted Count (UC) can thus be adjusted by up to 35% by the Technical Complexity Factor:

TCF = 0.65 + 0.01 * ΣFi

62

slide63

Software Estimation Models: FP …

  • 14 Technical Complexity Factors (each rated 0–5):
    • F1 Data Communications
    • F2 Distributed Functions
    • F3 Performance
    • F4 Heavily Used System
    • F5 Transaction Rate
    • F6 Online Data Entry
    • F7 End User Efficiency
    • F8 Online Update
    • F9 Complex Processing
    • F10 Reusability
    • F11 Installation Ease
    • F12 Operational Ease
    • F13 Multiple Sites
    • F14 Facilitate Change

TCF = 0.65 + 0.01 * ΣFi

63

slide64

Software Estimation Models: FP …

  • Example: Building a new application with 5 Input Files, 2 Output Files, 20 Inquiries, 4 Logical Internal Files, 3 Interfaces, with the following weights (count of items at each complexity in parentheses):

Type (count)   Simple   Average   Complex
I (5)          3 (3)    4         6 (2)
O (2)          4        5 (1)     7 (1)
E (20)         3 (12)   4 (4)     6 (4)
L (4)          7 (2)    10 (2)    15
F (3)          5 (1)    7 (1)     10 (1)

64

slide65

Software Estimation Models: FP …

  • UC = aI + bO + cE + dL + eF:

Type (count)   Simple    Average   Complex   Total
I (5)          3 (3)               6 (2)     21
O (2)                    5 (1)     7 (1)     12
E (20)         3 (12)    4 (4)     6 (4)     76
L (4)          7 (2)     10 (2)              34
F (3)          5 (1)     7 (1)     10 (1)    22
UC TOTAL                                     165

65

slide66

Software Estimation Models: FP …

  • Technical Complexity Factor ratings for the example:
    • F1 Data Communications 3
    • F2 Distributed Functions 2
    • F3 Performance 5
    • F4 Heavily Used System 3
    • F5 Transaction Rate 4
    • F6 Online Data Entry 4
    • F7 End User Efficiency 3
    • F8 Online Update 1
    • F9 Complex Processing 2
    • F10 Reusability 5
    • F11 Installation Ease 2
    • F12 Operational Ease 2
    • F13 Multiple Sites 0
    • F14 Facilitate Change 1

ΣFi = 37

66

slide67

Software Estimation Models: FP …

  • Example: Combining the components of the FP formula:

FP = UC * (0.65 + 0.01 * ΣFi)

FP = 165 * (0.65 + 0.01 * 37)

= 168.3

  • This new application is estimated at approximately 168 Function Points, based on the complexity factors and weighting factors

67
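The arithmetic of the example can be checked in a few lines:

```python
# Reproducing the example above: UC = 165 and the 14 technical
# complexity factor ratings sum to 37
uc = 165
sum_fi = 37

tcf = 0.65 + 0.01 * sum_fi   # technical complexity factor = 1.02
fp = uc * tcf                # 168.3 function points
```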

slide68

Software Estimation Models: FP …

  • Observed Limitations/Problems with FP:
    • Subjectivity in Technical Factors
    • Double counting complexity – weightings and TCF
    • Problems with accuracy
    • Problems with early life cycle use
    • Subjective Weightings
    • Technology Dependence

68

slide69

Software Est. Models: COCOMO II …

  • COnstructive COst MOdel – COCOMO:
    • Introduced by B. Boehm, 1981
    • Hierarchy of software estimation models
    • Original model one of the most widely used and discussed models in the industry
    • Evolved to COCOMO II, a more comprehensive model, 1996
    • Requires sizing information; options include:
      • Object Points, Function Points, LOC

69

slide70

Software Est. Models: COCOMO II …

  • COCOMO II hierarchy of estimation models address the following:
    • Application Composite Model
      • Early design stages; prototyping UI, etc.
    • Early Design Stage Model
      • Used after requirements stabilized and basic software architecture established
    • Post-Architecture-Stage Model
      • Used during the construction of the software

70

slide71

Software Est. Models: COCOMO II …

  • Application Composite Model:
    • PM = NOP / PROD
    • Where:
      • PM = Estimated effort in Person Months
      • NOP = New Object Points

NOP = [object points * (100-Reuse%)] / 100

      • PROD = Productivity (NOP / PM)
      • PROD is subjective average of developer’s experience and maturity/capability of CASE tools

71
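A minimal sketch of the model above (the function name and the example figures are illustrative):

```python
def application_composite_pm(object_points, reuse_pct, prod):
    """COCOMO II Application Composition estimate:
    NOP = object points * (100 - %reuse) / 100, then PM = NOP / PROD."""
    nop = object_points * (100 - reuse_pct) / 100
    return nop / prod

# Illustrative: 50 object points, 20% reuse, nominal PROD = 13
pm = application_composite_pm(50, 20, 13)  # about 3.1 person-months
```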

slide72

Software Est. Models: COCOMO II …

  • Determining number of Objects and Complexity:
Object Type     Simple   Medium   Difficult
Screen          1        2        3
Report          2        5        8
3GL Component   -        -        10

72

slide73

Software Est. Models: COCOMO II …

  • PROD – average of the developer’s experience and the environment maturity/capability of CASE tools, each rated on the scale below:

Rating   Very Low   Low   Nominal   High   Very High
PROD     4          7     13        25     50

73

slide74

Software Est. Models: COCOMO II …

  • Early Design Stage and Post-Architecture Stage Models:
    • PM = A * Size^B + C
    • Where:
      • PM = Estimated effort in Person Months
      • A = Constant; 1999 calibration A = 2.45
      • Size = KDSLOC (thousands of delivered source LOC): [estimated LOC or converted FP] / 1000
      • B = exponent based on 5 project scale factors
      • C = Automated re-engineering effort

74
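A minimal sketch of the effort equation; the exponent B below is an assumed illustrative value, since deriving it from the five scale factors is beyond this summary:

```python
def early_design_pm(kdsloc, a=2.45, b=1.10, c=0.0):
    """COCOMO II early-design / post-architecture effort:
    PM = A * Size^B + C, with Size in KDSLOC.
    A uses the 1999 calibration; B = 1.10 is an assumed value."""
    return a * kdsloc ** b + c

# Illustrative: a 10 KDSLOC project
pm = early_design_pm(10)  # about 31 person-months
```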

slide75

Software Est. Models: COCOMO II …

  • More on the EDM and PAM:
    • Both models can also take into account cost factors; EDM 7 cost factors; PAM 19
      • Complexity, reliability, DB size, documentation needs, processing constraints, etc.
    • Both models can be adjusted for re-use
      • New Source Lines of Code, Reused SLOC, Adapted SLOC, Estimated SLOC associated with reuse

75

slide76

References

  • W. S. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989
  • R. S. Pressman, Software Engineering: A Practitioner's Approach, 5th ed., McGraw-Hill, New York, 2001
  • I. Sommerville, Software Engineering, 5th ed., Addison-Wesley, Reading, MA, 1995
  • S. McConnell, Rapid Development: Taming Wild Software Schedules, Microsoft Press, Redmond, Washington, 1996
  • F. P. Brooks, Jr., The Mythical Man-month, Addison-Wesley, Reading, MA, 1975

76