Producing Production Quality Software

Presentation Transcript


  1. Producing Production Quality Software Lecture 14: Group Programming Practices Prof. Arthur P. Goldberg Fall, 2004

  2. Topics • Logistics • MIS ‘Best Practices’: Capers Jones’s view • Final Assignment • Improving Software Management • Effort Estimation • Literature • Productivity

  3. Oral Exam Logistics • Out morning of Dec. 11, with 2 example answers • Last day answering questions about the final is Dec. 14 • Exam session • 20 minutes, location TBD • Be on time, or be rescheduled • 5 questions, 4 minutes each: 2 you pick, 2 chosen at random, 1 I pick for you • Closed book • While answering, you may talk or write on paper or a white board • Results will be released on or about 12/23, after everyone has taken the exam

  4. HW5 Logistics • Last day answering questions is Dec. 9 • Grading • With Chris • Chris will read your code in advance • Exam session • 20 minutes, location TBD • Be on time, or be rescheduled • Chris gives you a file containing what you emailed us, and you run it • Chris gives you some sequences of commands, and you run them • Different students will receive different command sequences, depending on what needs testing • Results will be released after everyone has had HW5 graded

  5. Best Practices • As influences on • Quality • Productivity

  6. Software Development Quality • Capers Jones and Software Productivity Research’s quantitative approach • Material from Capers Jones, Software Assessments, Benchmarks, and Best Practices, Addison-Wesley, 1st edition (April 28, 2000) • Measured 9,000 development projects from 600 organizations • In 150 of the Fortune 500 • In 100 small companies • In many government and military organizations

  7. Classification • Classify organizations and projects by • Client country • Industry • Project ‘nature’ • Project scope • Software type • Project class • 37,400 permutations (10 natures × 10 scopes × 22 software types × 17 classes, as enumerated on the following slides) • Calibrate and benchmark measures of productivity and defect rate

  8. Project ‘Nature’ • New Development • Enhancement (new functions added to existing software) • Mandatory change (updates for new statutes or regulations) • Maintenance (defect repairs to existing software) • Performance updates (revisions needed to improve throughput) • Conversion or adaptation (migration to a new platform) • Nationalization (migration to a new national language) • Reengineering (re-implementing a legacy application) • Mass update (modification for the Euro or Y2K) • Hybrid (concurrent repairs and functional additions)

  9. Project Scope • Subroutine or sub-element of a program • Module of a program • Reusable module or object • Disposable prototype • Evolutionary prototype • Stand-alone program • Component of a system • Release of a system • New system or application • Compound system (multiple linked systems)

  10. Software Types • Non-procedural (spreadsheet, query, generators, and so forth) • Web applet • Batch application • Interactive application • Batch database application • Interactive database application • Pen-based application • Client/server application (two tier) • Client/server application (three tier) • Enterprise resource planning (ERP) application • Scientific or mathematical application • Systems or hardware control application • Communications or telecommunications application • Process control application • Embedded or real-time application • Trusted system with stringent security • Graphics, animation, or image-processing application • Robotic or manufacturing control application • Expert system with substantial knowledge acquisition • Artificial intelligence application • Neural net application • Hybrid project (multiple types)

  11. Project Class • Personal application for private use • Personal application to be shared by others • Academic program developed in an academic environment • Internal application to be installed at one location • Internal application to be accessed via an intranet or time-sharing • Internal application to be installed at many locations • Internal application developed by contract personnel • Internal application developed using military standards • External application to be freeware or shareware • External application to be placed on the World Wide Web • External application leased to users • External application embedded in hardware • External application bundled with hardware • External application marketed commercially • External application developed under outsource contract • External application developed under government contract • External application developed under military contract

  12. General project classes • End-user applications - developed privately for personal use • Information systems - developed in-house for corporate use (MIS) • Outsource or contract projects - developed under legally binding contract • Commercial software - developed to be marketed to external customers • Systems software - developed to control physical devices • Military software - developed using military standards

  13. Review ‘Group Programming Practices Handout’ • ‘Best Technical Practices for MIS Software’ • Distribution of SPR Project Benchmarks circa 1999

  14. The AMS • Changes in the specification • Questions

  15. Software Effort Estimation • Issues • When will the bleeping system be done? • How much will it cost? • If I give you Joe’s team, when will it be ready? • Approaches • Little or no estimation: XP • Function Point Analysis • Lister’s “Estimating Quality Factor” • Wideband Delphi

  16. Software Effort Estimation Function Point Analysis

  17. Function points • A conceptual measure of complexity • A linear sum, approximately • Simplistically, FPC = External_inputs * 4 + External_outputs * 7 + External_inquiries * 5 + Internal_logical_files * 4 + External_interface_files * 10 • Accuracy • To +/- 10% • IFPUG study by Kemerer at MIT in 1993 • Rarely used
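As a quick illustration, here is a minimal Python sketch of the slide's simplistic linear sum. The element counts are hypothetical, and the weights are the ones shown above; full Function Point Analysis adjusts each weight for complexity, as the next slide describes.

```python
# Minimal sketch of the slide's simplified (unadjusted) function point count.
# The element counts below are hypothetical; the per-type weights are the
# ones shown on this slide, not complexity-adjusted IFPUG weights.

WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 7,
    "external_inquiries": 5,
    "internal_logical_files": 4,
    "external_interface_files": 10,
}

def function_point_count(counts):
    """Return the simplistic linear-sum function point count (FPC)."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical application: 20 input screens, 12 reports, 8 queries,
# 6 internal files, 2 external interfaces.
example = {
    "external_inputs": 20,
    "external_outputs": 12,
    "external_inquiries": 8,
    "internal_logical_files": 6,
    "external_interface_files": 2,
}
print(function_point_count(example))  # 20*4 + 12*7 + 8*5 + 6*4 + 2*10 = 248
```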

  18. Functional complexity modifier • The multipliers vary with complexity • Data element type (DET) count • File type reference (FTR) count • Review tables in ‘Group Programming Practices Handout’
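To make the complexity adjustment concrete, the sketch below shows how a single External Input's weight might be chosen from its DET and FTR counts. The low/average/high weights (3/4/6) are the commonly cited IFPUG values for External Inputs, but the DET/FTR boundaries here are placeholders; the real thresholds are in the tables in the ‘Group Programming Practices Handout’.

```python
# Sketch of choosing a weight for one External Input from its DET and FTR
# counts. The 3/4/6 weights are the commonly cited IFPUG values for External
# Inputs; the DET/FTR boundaries below are placeholders, not the official
# IFPUG thresholds -- see the handout's tables for those.

EI_WEIGHTS = {"low": 3, "average": 4, "high": 6}

def external_input_weight(det_count, ftr_count):
    """Map DET/FTR counts to a complexity class, then to a weight."""
    if ftr_count <= 1 and det_count <= 4:      # placeholder boundary
        complexity = "low"
    elif ftr_count >= 3 or det_count >= 16:    # placeholder boundary
        complexity = "high"
    else:
        complexity = "average"
    return EI_WEIGHTS[complexity]

print(external_input_weight(det_count=3, ftr_count=1))   # 3 (low)
print(external_input_weight(det_count=20, ftr_count=2))  # 6 (high)
```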

  19. FP ‘Priesthood’: International Function Point Users Group www.ifpug.org • Rules, Version 4.1 (1999) • Certified Function Point Specialist • Examination • http://www.spr.com/products/function.htm, http://www.spr.com/products/programming.htm

  20. Backfiring • Convert SLOC to FPs • Driven by tables in ‘Group Programming Practices Data’ • About +/- 20% accurate
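A sketch of the backfiring arithmetic: divide SLOC by a language-specific SLOC-per-function-point ratio. The ratios below are illustrative figures often quoted for backfiring tables; the authoritative values are in ‘Group Programming Practices Data’.

```python
# Sketch of backfiring: converting SLOC to an approximate function point
# count by dividing by a language-specific SLOC-per-FP ratio. The ratios
# here are illustrative, commonly quoted values; use the tables in
# 'Group Programming Practices Data' for real work.

SLOC_PER_FP = {
    "c": 128,
    "cobol": 107,
    "java": 53,
}

def backfire(sloc, language):
    """Approximate function points from SLOC (about +/- 20% accurate)."""
    return sloc / SLOC_PER_FP[language.lower()]

print(round(backfire(64_000, "C")))     # ~500 FP
print(round(backfire(26_500, "Java")))  # ~500 FP
```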

  21. Software Effort Estimation: EQF • Tim Lister’s “Estimating Quality Factor” • Lister • The Atlantic Systems Guild, Inc. • Books • DeMarco and Lister, Waltzing With Bears: Managing Risk on Software Projects. Dorset House, 2003. • DeMarco and Lister, Peopleware: Productive Projects and Teams. Dorset House, 1999. • Seminars • Risk Management for Software • Leading Successful Projects
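The slide does not spell out how EQF is computed; the sketch below follows the formulation usually attributed to DeMarco and Lister: record the running estimate of some quantity over the life of the project, then divide the area under the actual value by the area between the estimates and that actual. A higher EQF means the estimates converged on reality sooner. The dates, estimates, and actual value in the example are hypothetical.

```python
# Hedged sketch of the Estimating Quality Factor (EQF) as it is usually
# described: the area under the actual result, divided by the area between
# the running estimates and that actual. All numbers below are made up.

def eqf(estimates, actual, end_time):
    """estimates: list of (time, estimated_value) pairs, sorted by time."""
    error_area = 0.0
    for (t0, est), (t1, _) in zip(estimates, estimates[1:] + [(end_time, None)]):
        error_area += abs(est - actual) * (t1 - t0)
    total_area = actual * (end_time - estimates[0][0])
    return float("inf") if error_area == 0 else total_area / error_area

# Effort estimates (staff-months) made at months 0, 3, and 6 of a 10-month project
history = [(0, 14.0), (3, 11.0), (6, 10.5)]
print(round(eqf(history, actual=10.0, end_time=10), 2))  # ~5.88
```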

  22. Software Effort Estimation Wideband Delphi

  23. Delphi History • Consensus without conflict • Invented by RAND • Santa Monica ‘think tank’ • Used to estimate nuclear bombing damage • Many other applications • Wideband Delphi • Developed by Barry Boehm in the 70s; see his Software Engineering Economics

  24. Wideband Delphi Overview • Input • Specification • Outputs • Detailed task list • Estimation assumptions • Effort estimates from each participant

  25. Wideband Delphi Process Overview • Form group of experts • Estimate: Each group member provides an anonymous estimate • Coordinator assesses the estimates • Asks for reassessment of crazy estimates • Combines estimates • Presents combined results to experts • If combined results do not converge then goto Estimate

  26. Wideband Delphi • Benefits • Practical • Employ multiple minds; promote reasoning • An accurate estimate avoids the underestimation dilemma of “skip steps & compromise quality vs. blow the schedule” • Build a comprehensive task list • Psycho-social • Reduce bias by influential people, or those with divergent agendas • May resolve disagreement among hostile parties • ‘Buy-in’ of ‘stakeholders’ • Other • Acknowledges uncertainty • Drawbacks / costs • Time • Takes power away from the manager or ‘guru’ (their drawback)

  27. Wideband Delphi - 1 • Each individual develops an estimate, which is a list of • Tasks • Assumption(s) • Effort • Rules • Assume you’ll do all the work • Assume uninterrupted effort

  28. Wideband Delphi – 2 • Moderator presents • Distribution of effort (anonymous) estimates • All task lists (or merged task lists) • All participants modify estimates concurrently and secretly

  29. Wideband Delphi • Termination—at earliest of • 4 rounds • Acceptable convergence • Meeting time over • Nobody willing to change • Completion • Moderator assembles the tasks into single list • Moderator merges assumptions • Moderator combines estimates • Possibilities • Ave • Min, ave, max • Ave and std-dev
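As a concrete illustration of the moderator's "combines estimates" step, here is a small Python sketch that summarizes one round of anonymous estimates and applies a simple convergence test. The 15% spread rule and the sample numbers are assumptions for illustration; the slides leave the convergence criterion to the moderator.

```python
# Sketch of one Wideband Delphi round from the moderator's side: combine the
# anonymous effort estimates (min, average, max, standard deviation) and
# check a simple convergence rule. The 15% spread threshold and the sample
# numbers are assumptions, not part of the method as presented.

from statistics import mean, stdev

def combine(estimates, spread_limit=0.15):
    avg = mean(estimates)
    summary = {
        "min": min(estimates),
        "avg": avg,
        "max": max(estimates),
        "std_dev": stdev(estimates) if len(estimates) > 1 else 0.0,
    }
    converged = (max(estimates) - min(estimates)) <= spread_limit * avg
    return summary, converged

# Effort estimates (person-weeks) from four participants after round 2
round_2 = [12, 14, 13, 15]
summary, converged = combine(round_2)
print(summary, "converged" if converged else "another round needed")
```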

  30. Wideband Delphi Practice • Let’s try it on the AMS • Use the forms in the handout • Break into groups of 3 or 4 • Record your effort for the final assignment • Compare

  31. Improving Software Management • Improve an organization • Involves • Technology • Management • Psychology • Approaches • CMM • SPR • Weinberg

  32. Software Process Assessment Approaches • SEI CMM – focus: mandated activities, and change processes • Review ‘Group Programming Practices Handout’ • Five Levels of the SEI CMM • Five Levels of the SPR Excellence Scale • Approximate Conversion Between SPR and SEI Software Scores • Trained assessors • SPR – focus: project

  33. SPR assessment • SPR consultant holds a meeting with project manager and up to 7 technical staffers – 2 to 4 hours • Official questionnaire • Group consensus answers • Finding themes • Findings about the projects or software products assessed • Findings about the software technologies utilized • Findings about the software processes utilized • Findings about the ergonomics and work environments for staff • Findings about personnel and training for management and staff

  34. SPR assessment example • Large telecom • One software lab, 450 people • MIS • 30 projects, related to core business

  35. Example strengths (better than telecom industry average) • Requirements analysis (quality function deployment [QFD]) • Change control methods (fully automated with traceable requirements) • Project management: quality measurement and metrics • Design methods, fully automated • Customer support tools, fully automated • Customer support methods, both hot-lines and Internet support • Maintenance release testing • Regression testing • Development testing • Performance modeling • Performance testing • Lab testing of hardware and software concurrently • Support for telecommuting and remote employees • Support for multi-site projects

  36. Example average factors (with respect to other major telecoms) • Staff experience with application types • Staff experience with in-house development processes • Staff experience with development tools • Staff experience with programming languages • Staff specialization: testing • Staff specialization: quality assurance • Staff specialization: maintenance • Quality Control: use of formal design inspections • Quality Control: use of formal code inspections • Quality Control: formal software quality assurance teams • Unit testing by development personnel

  37. Example weaknesses (worse than industry average) • Project Management: annual training in state-of-the-art methods • Project Management: cost estimating • Project Management: quality estimating • Project Management: risk analysis • Project Management: value analysis • Project Management: schedule planning • Project Management: lack of productivity measurements • Project Management: lack of productivity metrics • Maintenance: no use of complexity analysis • Maintenance: no use of code-restructuring tools • No reuse program: requirements • No reuse program: design • No reuse program: source code • No reuse program: test materials • No reuse program: documentation • No reuse program: project plans

  38. Literature

  39. Au revoir et merci (goodbye and thank you)

  40. 10 Key Process Areas for Software Process Assessment • Project Management methods such as estimating • Requirements-gathering and analysis methods • Design and specification methods • Coding methods • Reusability methods • Change control methods • User documentation methods • Pretest defect removal methods such as inspections • Testing methods and tools • Maintenance methods and tools

  41. 10 Key Personnel-related Topics for Software Process Assessment • Staff hiring practices • Staff training and education • Management training and education • Specialists and occupational groups • Compensation levels • Office ergonomics • Organizational structures • Morale surveys and results • Work patterns and overtime • Staff turnover rates

  42. Productivity

  43. Benchmark studies • Compare like with like, i.e., within Classification • Percent of corporate employees in information technology • Number of users supported per staff member in information technology • Annual corporate spending for computer hardware equipment • Revenues per information technology employee • Sales volumes correlated to information technology expenses • Profitability correlated to information technology expenses • Average salaries for information technology workers • Numbers of specialist occupations employed within information technology • Annual attrition or turnover among information technology employees • Corporate revenues per information technology worker

  44. Benchmark • Achievement • Expense • Customer satisfaction

  45. Missing benchmarks • Numbers and sizes of corporate databases • The gaps associated with database benchmarks reflect an interesting problem. As this book is written there are no metrics available for expressing the size or volume of information in a database. For software, both LOC metrics and function point metrics have long been available for performing benchmark studies. However, the database domain does not have any size metrics at all. Due to the lack of data metrics, there are no benchmarks on many critical topics in the database, data warehouse, and data mining domains. • As this book is written there are no published studies that explore how much data a corporation owns, the numbers of errors in the data, how much it costs to create a database, the value of the data, or the eventual costs of removing obsolete data. Thus, a topic of considerable economic importance is, essentially, a void in terms of benchmark studies.

  46. Benchmark client questions • What are the best-in-class productivity rates in our industry? • What are the best-in-class development costs in our industry? • What are the best-in-class development schedules in our industry? • What are the best-in-class maintenance assignment scopes in our industry? • What are the best-in-class quality levels in our industry? • What are the best development processes for software like ours? • What are the best development tools for software like ours? • What does it cost to move from SEI CMM level 1 to SEI CMM level 3? • What are the differences in results between SEI CMM level 1 and SEI CMM level 3?

  47. Size/accomplishment measures • “Source lines of code” (SLOC) • Advantages • Easy • Easily automated • Used by many tools • Problems • See Capers Jones, p. 71

  48. 20 Software Artifacts for which size information is important • The functionality of the application • The volume of information in databases • The quantity of new source code to be produced • The quantity of changed source code, if any • The quantity of deleted source code, if any • The quantity of base code in any existing application being updated • The quantity of "dead code" no longer utilized but still present • The quantity of reusable code from certified or uncertified sources • The quantity of paper deliverables (plans, specifications, documents, and so on) • The sizes of paper deliverables (pages, words) • The number of national languages (English, French, Japanese, and so on) • The number of on-line screens • The number of graphs and illustrations • The size of nonstandard deliverables (for example, music, animation) • The number of test cases that must be produced • The number of bugs or errors in requirements • The number of bugs or errors in specifications • The number of bugs or errors in source code • The number of bugs or errors in user manuals • The number of secondary "bad fix" bugs or errors

  49. Cost Factors • Variations due to inflation rates for studies spanning several years • Variations due to industry compensation averages • Variations due to company size • Variations due to geographic regions or locations • Variations due to merit appraisals or longevity • Variations due to burden rate or overhead differences • Variations due to work patterns and unpaid overtime

  50. Productivity paradox • See Capers Jones, p. 73
