
Lecture 2


Presentation Transcript


  1. Lecture 2 • Software Project Management

  2. Topics • Project Organization • Process and Project Metrics • Planning and Estimation • Risk Management

  3. Additional References • Abdel-Hamid, Tarek & Madnick, Stuart (1991) Software Project Dynamics. Prentice Hall. • Jones, Capers (1991). Applied Software Measurement. McGraw Hill. • DeMarco, Tom (1982). Controlling Software Projects. Yourdon Press. • Donaldson, S. & Siegel, S. (1997). Cultivating Successful Software Development. Prentice Hall. • Humphrey, Watts (1995). A Discipline for Software Engineering. Addison Wesley.

  4. Additional References • Fairley, Richard & Rook, Paul, “Risk Management for Software Development,” in Software Engineering, Dorfman, M. & Thayer, R. (eds) (1997). IEEE Computer Society Press. • Shumskas, Anthony, “Software Risk Mitigation,” in Total Quality Management for Software. Schulmeyer, G. & McManus, J. (eds) (1996). International Thomson Computer Press. • Sommerville, Ian (2001). Software Engineering (6th ed.). Addison-Wesley.

  5. Additional References • Sample text for a Software Development Plan http://www.airtime.co.uk/users/wysywig/sdp_1.htm • Boehm, B. (1995), Anchoring the Software Process. http://sunset.usc.edu/TechRpts/Papers/usccse95-507/ASP.html • University of Southern California - Center for Software Engineering http://sunset.usc.edu • Software Program Managers Network http://www.spmn.com • NASA Web Integrated Software Environment http://research.ivv.nasa.gov/projects/WISE/

  6. Additional References • Practical Software Measurement (PSM), Version 3.1, April 17, 1998. Office of the Undersecretary of Defense for Acquisition and Technology

  7. Topics • Project Organization • Process and Project Metrics • Planning and Estimation • Risk Management

  8. Net U.S. Productivity for New Projects Jones (1991), p. 165

  9. Process Factors Jones (1991), p. 169

  10. Workforce Level at IBM’s DP Services Organization Abdel-Hamid, p. 162

  11. Resource Consumption Schach, pp. 278, 279

  12. Ideal vs. Actual Staff Loading Strategies DeMarco (1982), p. 176

  13. Alternate Staffing Strategies DeMarco (1982), p. 181

  14. Chief Programmer Team Sommerville, p. 502

  15. Modern Programming Team Structure Schach, p. 98

  16. Organizational Structure for Larger Projects Schach, p. 99

  17. Decentralized Decision Making Schach, p. 100

  18. People Differences (1) • Ability to perform the work • Interest in the work • Experience with similar applications • Experience with similar tools or languages • Experience with similar techniques • Experience with similar development environment Pfleeger, p. 90

  19. People Differences (2) • Training • Ability to communicate with others • Ability to share responsibility with others • Management skills Pfleeger, p. 90

  20. Work Styles Pfleeger, p. 93

  21. Topics • Project Organization • Process and Project Metrics • Planning and Estimation • Risk Management

  22. What Are Metrics? • Metric - quantitative measurement (used for tracking purposes) • Process Metrics - feedback to improve the process, productivity • Project Metrics - used to track project progress • Product Metrics - track quality of product

  23. Common Issues - Measurement Categories • Schedule and Progress: Milestone Performance, Work Unit Progress, Incremental Capability • Growth and Stability: Product Size and Stability, Functional Size and Stability • Technical Adequacy: Technology Impacts, Target Computer Resource Utilization, Technical Performance • Resources and Cost: Personnel, Financial Performance, Environment Availability • Development Performance: Process Maturity, Productivity • Product Quality: Defects, Complexity, Rework

  24. Metrics Phase Coverage

  25. Measurement Application Process

  26. Core Program Metrics - As Recommended by the SEI • Size - How large is the job? Does my process provide valid estimates? - measured (actual vs. planned) by function points, SLOC, CSCIs, SUs, etc. • Effort - Have we planned for sufficient resources? - measured by labor costs, labor hours, etc. • Cost and Schedule - Are we progressing as planned? On budget? On schedule? - measured by actual vs. planned cost to date, Gantt charts, milestones, reviews, deliverables, etc. • Quality - How good is the product? - measured by defects, changes, fixes, etc. Source: Carleton, et al., Software Management for DoD Systems: Recommendations for Initial Implementation, SEI Technical Report, Sept 1992

  27. Cost and Schedule Metrics (Earned Value Example) [Chart: cumulative cost vs. contract month (months 15-30, with TRR and CDR milestones), plotting the contract budget base, the planned cost of work scheduled, the planned cost of work performed, and the actual cost of work performed. At "now," the gap between planned cost of work performed and actual cost of work performed is the cost variance; the gap between planned cost of work performed and planned cost of work scheduled is the schedule variance. The actual-cost curve extrapolates to a projected actual end cost at the end of the project.]
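The two variances named in the chart fall directly out of the three earned-value baselines. A minimal sketch (the dollar figures are invented for illustration):

```python
# Earned-value variances from the three cost baselines in the chart.
def earned_value(bcws, bcwp, acwp):
    """bcws: planned cost of work scheduled; bcwp: planned cost of
    work performed (earned value); acwp: actual cost of work performed."""
    cost_variance = bcwp - acwp       # negative => over budget
    schedule_variance = bcwp - bcws   # negative => behind schedule
    return cost_variance, schedule_variance

cv, sv = earned_value(bcws=120_000, bcwp=100_000, acwp=130_000)
print(cv, sv)  # -30000 -20000
```

Both variances are expressed in cost units, which is why the chart can show schedule slip as a vertical gap between two cost curves.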

  28. Guidance on Metrics (1) • Set clearly defined goals (Goal-Question-Metric (GQM) paradigm) • Focus on project issues (risk management) • Begin by only collecting a few key measurements • Develop a “Project Measurement Plan” • Must have management commitment and “buy-in” from project personnel • training, briefs • allocate adequate resources; metrics analyst • set up a metrics working group

  29. Guidance on Metrics (2) • Provide timely feedback to project personnel on metrics analysis • Automate measurement collection as much as possible • minimize impact on project personnel • Measurement results should remain anonymous • do not measure individuals • all projects are different, do not compare • focus on the process and the product

  30. Guidance on Metrics (3) • Understand that metrics are not an end to themselves • metrics cannot identify, explain, or predict everything • most issues/problems require more than one measurement to characterize and understand • metrics should augment, not replace, good management and engineering judgment • Encourage feedback from measurement program participants • Plan for changes to measurement program and the types of measurements collected

  31. Topics • Project Organization • Process and Project Metrics • Planning and Estimation • Risk Management

  32. Planning in the Spiral Model Schach, p. 81

  33. Software Development Subsystems Abdel-Hamid, p. 22

  34. Project Planning - Lessons Learned (1) • Planning requires a software systems development life cycle to provide a framework for considering the specific tasks to be performed. • Planning needs to account for the interaction among management, development, and product assurance disciplines throughout the project life cycle. • Planning is an ongoing negotiation between the customer and seller. Donaldson & Siegel, pp. 32-34

  35. Project Planning - Lessons Learned (2) • Planning maps out the envisioned technical approach, resources, schedule, and milestones for the transition from the current state to a desired state. • Planning should incorporate the need for change. • Planning needs to assess risk to determine the appropriate mix of management, development, and product assurance resources. Donaldson & Siegel, pp. 32-34

  36. Project Planning - Lessons Learned (3) • Planning is required for any software systems development, and it is captured in a project plan ranging from a one-page memo to a sizable document. Donaldson & Siegel, pp. 32-34

  37. Project Planning Framework [Flowchart, Humphrey, p. 146: a customer need drives a sequence of tasks - define the requirements, produce the conceptual design, estimate the product size (using a historical size database), estimate the resources (using a historical productivity database and the resources available), produce the schedule, and develop the product. The delivered product goes to the customer, while tracking reports and size, resource, and schedule data feed management and an "analyze the process" step.]

  38. Other Planning Considerations • Training • Facilities • Tools and other support hardware and software • Documentation and other deliverables • Special testing requirements

  39. Size Estimation - Lines of Code - Definitions • SLOC - Source (Single) Line of Code • KLOC - Thousand Lines of Code • PLOC - Physical Line of Code • LLOC - Logical Line of Code • CLOC - Commented Line of Code • NCLOC - Non-Commented Line of Code • DSI - Delivered Source Instruction
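A toy counter makes the PLOC/CLOC/NCLOC distinction concrete. One simplification is assumed here (not stated on the slide): a "commented" line is any non-blank line containing a `#` marker, and blank lines are excluded from PLOC.

```python
# Toy line counter for a few of the LOC definitions above.
def line_counts(source):
    physical = [l for l in source.splitlines() if l.strip()]  # non-blank physical lines
    cloc = sum(1 for l in physical if "#" in l)               # lines carrying a comment
    return {"PLOC": len(physical), "CLOC": cloc, "NCLOC": len(physical) - cloc}

sample = "x = 1\n# setup\ny = x + 1  # increment\n\nprint(y)\n"
print(line_counts(sample))  # {'PLOC': 4, 'CLOC': 2, 'NCLOC': 2}
```

Real counters differ on exactly these conventions (blank lines, mixed code-plus-comment lines, logical vs. physical statements), which is why the definitions on the slide matter when comparing size data across projects.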

  40. Software Size Attributes • Length • Functionality • Complexity • Reuse

  41. Size Estimation - Wideband-Delphi • Group of experts is given program’s specification and estimation form. • Group meets to discuss project goals, assumptions, and estimation issues. • Individuals each anonymously list project tasks and a size estimate. • A moderator tabulates results. • Experts meet to discuss results by reviewing tasks that have been defined but not the estimates. • Cycle continues until estimates converge.
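The moderator's tabulation step in the cycle above can be sketched as follows; the convergence tolerance and the estimates are invented for illustration.

```python
# Tabulate one Wideband-Delphi round: summarize the anonymous size
# estimates and check whether their spread is within tolerance.
def tabulate(estimates, tolerance=0.10):
    mean = sum(estimates) / len(estimates)
    spread = (max(estimates) - min(estimates)) / mean  # relative spread
    return round(mean), spread <= tolerance            # (summary, converged?)

round1 = [8000, 12000, 15000, 9000]    # anonymous SLOC estimates, round 1
round2 = [10000, 11000, 10500, 10200]  # after the experts discuss the tasks
print(tabulate(round1))  # wide spread: keep cycling
print(tabulate(round2))  # estimates have converged
```

Note that between rounds the experts discuss the task lists, never each other's numbers, which is what keeps the convergence from being mere anchoring.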

  42. FFP (File-Flow-Process) Metric (van der Poel & Schach) S = Fi + Fl + Pr and C = b × S, where S = size, C = cost, Fi = files, Fl = flows, Pr = processes, and b = measure of efficiency (a constant that varies from organization to organization). Schach, p. 267
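Under the usual FFP definitions (size S = Fi + Fl + Pr, cost C = b × S), the computation is trivial; the sketch below uses invented counts, and b must in practice be calibrated from an organization's past projects.

```python
# FFP size and cost; counts and the efficiency constant b are invented.
def ffp_size(fi, fl, pr):
    return fi + fl + pr  # S = Fi + Fl + Pr

def ffp_cost(size, b):
    return b * size      # C = b * S

s = ffp_size(fi=12, fl=30, pr=18)
print(s, ffp_cost(s, b=950))  # 60 57000
```

The metric's value lies almost entirely in the calibration of b, since the raw count weights files, flows, and processes equally.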

  43. Function Point Calculation • Unadjusted: UFP = a1(Inp) + a2(Out) + a3(Inq) + a4(Maf) + a5(Inf) • Adjusted: TCF = 0.65 + 0.01 × DI; FP = UFP × TCF • FP = Function Points, TCF = Technical Complexity Factor, UFP = Unadjusted Function Points, DI = Degree of Influence

  44. Table of Function Point Values (ai) [The standard Albrecht weights, by component complexity: a1 (Inp) = 3/4/6, a2 (Out) = 4/5/7, a3 (Inq) = 3/4/6, a4 (Maf) = 7/10/15, a5 (Inf) = 5/7/10 for simple/average/complex, respectively.]

  45. Technical Factors for Function Point Computation (DI) Step 1: Assign a value between 0 and 5 to each of the 14 factors, where 0 implies “not present or no influence” and 5 implies “strong influence throughout.” Step 2: DI is the sum of the 14 values. Schach, p. 268
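Putting slides 43-45 together, a minimal sketch of the adjusted function-point computation, assuming the standard Albrecht "average" weights for the a_i (the component counts and DI below are invented):

```python
# Adjusted function-point computation from the formulas above.
# Assumption: a_i take the standard Albrecht "average" weights.
WEIGHTS = {"Inp": 4, "Out": 5, "Inq": 4, "Maf": 10, "Inf": 7}

def function_points(counts, di):
    """counts: component counts; di: sum of the 14 factor scores (0..70)."""
    ufp = sum(WEIGHTS[name] * counts[name] for name in WEIGHTS)
    tcf = 0.65 + 0.01 * di  # TCF ranges from 0.65 to 1.35
    return ufp, ufp * tcf

counts = {"Inp": 20, "Out": 15, "Inq": 10, "Maf": 5, "Inf": 4}
ufp, fp = function_points(counts, di=35)
print(ufp, round(fp, 1))
```

With DI = 35 the TCF is 1.0, so the adjusted and unadjusted counts coincide; stronger technical influences push FP up to 35% above UFP, weaker ones down to 35% below.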

  46. Object Points • Object points are an alternative function-related measure to function points when 4GLs or similar languages are used for development • Object points are NOT the same as object classes • The number of object points in a program is a weighted estimate of • The number of separate screens that are displayed • The number of reports that are produced by the system • The number of 3GL modules that must be developed to supplement the 4GL code

  47. Object Point Estimation • Object points are easier to estimate from a specification than function points as they are simply concerned with screens, reports and 3GL modules • They can therefore be estimated at an early point in the development process. At this stage, it is very difficult to estimate the number of lines of code in a system
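The weighted estimate described above can be sketched as follows. The weights are an assumption taken from the COCOMO II application-composition model (screens 1/2/3 and reports 2/5/8 by complexity, 10 per 3GL module); the component counts are invented.

```python
# Object-point count per the weighted estimate described above.
SCREEN_W = {"simple": 1, "medium": 2, "difficult": 3}
REPORT_W = {"simple": 2, "medium": 5, "difficult": 8}
MODULE_W = 10  # each supplementary 3GL module

def object_points(screens, reports, modules_3gl):
    op = sum(SCREEN_W[c] * n for c, n in screens.items())
    op += sum(REPORT_W[c] * n for c, n in reports.items())
    return op + MODULE_W * modules_3gl

print(object_points({"simple": 6, "difficult": 2}, {"medium": 4}, modules_3gl=3))  # 62
```

Because screens, reports, and 3GL modules are visible in an early specification, this count can be produced long before lines of code can be estimated with any confidence.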

  48. Cost Estimation Techniques • Expert Judgment by Analogy • Bottom-Up Approach • Algorithmic Cost Estimation Schach, pp. 270 - 272

  49. Structuring of Software Cost Estimation (SCE) Models Heemstra, in Dorfman & Thayer, p. 378

  50. General Cost Estimation Structure Heemstra, in Dorfman & Thayer, p. 379
