
  1. SE 325/425 Principles and Practices of Software Engineering, Autumn 2006. James Nowotarski, 24 October 2006

  2. Today’s Agenda (Topic / Duration) • Planning and metrics recap: 20 minutes • Analysis modeling: 60 minutes *** Break • Current event reports: 20 minutes • Analysis modeling (cont.): 30 minutes • Design modeling: 30 minutes

  3. People trump process “A successful software methodology (not new, others have suggested it): (1) Hire really smart people (2) Set some basic direction/goals (3) Get the hell out of the way. In addition to the steps above, there’s another key: RETENTION” http://steve-yegge.blogspot.com/2006/09/good-agile-bad-agile_27.html

  4. Planning process [diagram]: users supply requirements; (1) negotiate requirements → negotiated requirements; (2) decompose → work breakdown structure; (3) estimate size → deliverable size; (4) estimate resources, using a productivity rate → work months; (5) develop schedule → schedule. Iterate as necessary.

  5. Work Breakdown Structure • Breaks project into a hierarchy. • Creates a clear project structure. • Avoids risk of missing project elements. • Enables clarity of high level planning.
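
A minimal sketch of the idea behind a WBS, assuming Python as the illustration language: the project is a tree whose leaf work packages carry effort estimates that roll up through the hierarchy. The activity names and numbers are hypothetical, not from the lecture.

```python
# Hypothetical WBS: leaf work packages carry effort; parents roll it up.
from dataclasses import dataclass, field

@dataclass
class WbsNode:
    name: str
    effort_days: float = 0.0                      # set on leaf work packages only
    children: list["WbsNode"] = field(default_factory=list)

    def total_effort(self) -> float:
        """Sum leaf estimates up through the hierarchy."""
        if self.children:
            return sum(child.total_effort() for child in self.children)
        return self.effort_days

project = WbsNode("Example release", children=[
    WbsNode("Requirements", children=[
        WbsNode("Elicitation workshops", 10),
        WbsNode("Use-case modeling", 15),
    ]),
    WbsNode("Design", children=[
        WbsNode("Data model", 8),
        WbsNode("Architecture", 12),
    ]),
])

print(project.total_effort())  # 45.0
```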

  6. Planning process [diagram repeated]: (1) negotiate requirements, (2) decompose into a work breakdown structure, (3) estimate size, (4) estimate resources using a productivity rate to get work months, (5) develop schedule; iterate as necessary.

  7. Units of Size • Lines of code (LOC) • Function points (FP) • Components

  8. Computing Function Points [worked example, count × weight = weighted count]: 5 × 3 = 15; 8 × 4 = 32; 10 × 4 = 40; 8 × 10 = 80; 2 × 5 = 10; count total = 177
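
The arithmetic on the slide can be reproduced with a short script. This is a minimal sketch assuming Python: the row labels follow the standard Albrecht/IFPUG measurement parameters, and the value-adjustment step FP = count total × (0.65 + 0.01 × ΣFi) is the textbook formula; neither is spelled out on the slide, so treat the labels and ratings as assumptions.

```python
# Count total for the slide's worked example (5x3 + 8x4 + 10x4 + 8x10 + 2x5 = 177).
# Row labels are assumed, not given on the slide.
rows = [
    ("external inputs",           5,  3),
    ("external outputs",          8,  4),
    ("external inquiries",       10,  4),
    ("internal logical files",    8, 10),
    ("external interface files",  2,  5),
]

count_total = sum(count * weight for _, count, weight in rows)
print(count_total)  # 177

# Standard value adjustment: Fi are the 14 general system characteristics, rated 0-5.
fi_ratings = [3] * 14                      # hypothetical ratings
fp = count_total * (0.65 + 0.01 * sum(fi_ratings))
print(round(fp, 1))                        # 177 * 1.07 = 189.4
```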

  9. Components Criteria: Simple – Medium – Hard –

  10. Project Management [framework diagram]: Planning & Managing spans Communication (project initiation, requirements), Modeling (analysis, design), Construction (code, test), and Deployment (delivery, support). Estimating can be done top down or bottom up.

  11. Empirical Estimation Models • Empirical data supporting most estimation models is derived from a limited sample of projects. • NO estimation model is suitable for all classes of software projects. • USE the results judiciously. • General model: E = A + B × (ev)^C, where A, B, and C are empirically derived constants, E is effort in person-months, and ev is the estimation variable (either LOC or FP).
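
As a concrete illustration of the general model, here is a minimal sketch in Python. The function is just E = A + B × (ev)^C; the constants plugged in below (A = 0, B = 2.4, C = 1.05, with ev in KLOC) echo one published basic-COCOMO calibration for organic projects and are illustrative only, since real values must be derived from an organization's own project history.

```python
def estimate_effort(ev: float, a: float, b: float, c: float) -> float:
    """General empirical model E = A + B * ev**C (effort in person-months)."""
    return a + b * ev ** c

# Illustrative constants (basic-COCOMO-style, organic mode, ev in KLOC).
effort = estimate_effort(ev=32.0, a=0.0, b=2.4, c=1.05)
print(round(effort, 1))   # ~91.3 person-months for a 32 KLOC system
```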

  12. Be sure to include contingency The earlier “completed programs” size and effort data points in Figure 2 are the actual sizes and efforts of seven software products built to an imprecisely-defined specification [Boehm et al. 1984]†. The later “USAF/ESD proposals” data points are from five proposals submitted to the U.S. Air Force Electronic Systems Division in response to a fairly thorough specification [Devenny 1976]. http://sunset.usc.edu/research/COCOMOII/index.html

  13. Planning process [diagram repeated]: (1) negotiate requirements, (2) decompose into a work breakdown structure, (3) estimate size, (4) estimate resources using a productivity rate to get work months, (5) develop schedule; iterate as necessary.

  14. Gantt Schedule • Views the project in the context of time. • Critical for monitoring a schedule. • Granularity of 1-2 weeks.
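
A Gantt chart is essentially a set of tasks with durations and dependencies laid out against time. A minimal sketch of that underlying data in Python, with hypothetical task names and a simple forward pass that derives each task's start and finish dates:

```python
from datetime import date, timedelta

# Hypothetical tasks: name -> (duration in weeks, list of predecessor names).
tasks = {
    "negotiate requirements":   (2, []),
    "decompose (WBS)":          (1, ["negotiate requirements"]),
    "estimate size/resources":  (1, ["decompose (WBS)"]),
    "develop schedule":         (1, ["estimate size/resources"]),
}

project_start = date(2006, 10, 24)
_finish: dict[str, date] = {}

def finish_date(name: str) -> date:
    """Forward pass: a task starts when its last predecessor finishes."""
    if name not in _finish:
        duration, preds = tasks[name]
        start = max((finish_date(p) for p in preds), default=project_start)
        _finish[name] = start + timedelta(weeks=duration)
    return _finish[name]

for name, (duration, preds) in tasks.items():
    start = max((finish_date(p) for p in preds), default=project_start)
    print(f"{name:26s} {start} -> {finish_date(name)}")
```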

  15. Objectives of Software Measurement • Help a systems development unit understand their performance • Evaluate performance relative to goals • Allow for comparisons to, e.g.,: • Other organizations • Alternative development approaches (custom, packaged, outsourced, etc.) and technologies • Other standards/targets • Improve estimating ability • Promote desired behaviors, e.g., reuse

  16. GQM Example (High Level) Goal: Improve systems delivery performance. Question → Metric: What is the quality of our deliverables? → Fault density. How predictable is our process? → Duration variance percentage. How quickly do we deliver? → Delivery rate. How efficient are we? → Productivity rate.
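
The four metrics in this GQM tree can all be computed from a handful of basic project facts. A minimal sketch, assuming Python and common formulas for each metric (the lecture does not define them precisely, and the sample values are made up):

```python
def fault_density(faults: int, size_fp: float) -> float:
    """Quality: faults per developed function point."""
    return faults / size_fp

def delivery_rate(size_fp: float, elapsed_months: float) -> float:
    """Speed: function points delivered per elapsed month."""
    return size_fp / elapsed_months

def productivity_rate(size_fp: float, staff_months: float) -> float:
    """Efficiency: function points per staff month."""
    return size_fp / staff_months

def duration_variance_pct(actual_months: float, planned_months: float) -> float:
    """Predictability: schedule variance above commitment, as a percentage."""
    return (actual_months - planned_months) / planned_months * 100

print(fault_density(45, 1500))          # 0.03 faults per FP
print(delivery_rate(1500, 12))          # 125.0 FP per elapsed month
print(productivity_rate(1500, 180))     # ~8.3 FP per staff month
print(duration_variance_pct(14, 12))    # ~16.7% over commitment
```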

  17. Example: Speed of delivery [scatter chart: Elapsed Months vs. Developed Function Points. Each point is a single project release (average elapsed months = 14.8, n = 33); the Industry Average line is determined from Software Productivity Research.]

  18. Example: Schedule reliability [scatter chart: Schedule Variance above commitment (%) vs. Developed Function Points. Each point is a single project release (n = 33); the Industry Average line is determined from Software Productivity Research.]

  19. Example: Software quality [scatter chart: Faults reported over the first three months in operations vs. Developed Function Points (n = 27). The industry-average line assumes that half of the total faults are found in the first three months in operation, i.e., one half of the industry average of total faults from C. Jones, Applied Software Measurement, 1996, p. 232.]

  20. Example: Productivity [scatter chart: Function Points per Staff Month vs. Developed Function Points. Each point is a single project release (n = 33); the Industry Average line is determined from Software Productivity Research.]

  21. Measurement and Continuous Improvement [diagram: Measurement and Continuous Improvement shown as a linked pair, each feeding the other.]

  22. Continuous Process Improvement Approach to Quality and Measurement (Plan-Do-Check-Act): 1. Identify performance standards and goals (Plan). 2. Measure project performance (Do). 3. Compare metrics against goals (Check). 4. Eliminate causes of deficient performance: fix defects and fix root causes (Act).

  23. Metrics Programs Need to Address People, Process, Technology [rollout roadmap diagram spanning four phases (Enable Change, Achieve-1 Change, Achieve-2 Change, Sustain Change) under Quality Management and Program Management. People elements include: metrics strategy, commitment/ownership, metrics awareness education, roles & responsibilities, metrics rollout education/training, ongoing metrics education/training, metrics network, large project network, community of practice. Process elements include: vital few metrics definitions and implementation, measurement process definition and improvement, improvement goals, dashboard metrics implementation, metrics definition & implementation for delivery centers, metrics embedded in system building methods. Technology elements include: technology strategy, metrics repository and tools, KM support for measurement. Rollout proceeds from pilots on selected projects and selected delivery centers, with distributed support units, to enabling large projects and remaining centers.]

  24. Most metrics programs fail within first 2 years Reasons • Lack of [visible] executive sponsorship • Lack of alignment with organizational goals • Tendency to collect too much data • Measures not calibrated, normalized, or validated • Not comparing apples-to-apples • Fear of [individual] evaluation • Learning curve (e.g., function points) • Cost overhead

  25. Key Success Factors • Ensure that measurement is part of something larger, typically performance improvement • “Trojan Horse” strategy • Ensure alignment with organizational goals • Start small, iterate • Strongly recommend doing a pilot test • Automate capture of metrics data • Rigorously define a limited, balanced set of metrics • “Vital Few” • Portfolio approach • Comparability • Aggregate appropriately • Focus should be on processes, not individuals • Obtain [visible] executive sponsorship • Understand and address the behavioral implications

  26. Other Quotes “Count what is countable, measure what is measurable, and what is not measurable, make measurable” Galileo

  27. Other Quotes “In God we trust – All others must bring data” W. Edwards Deming

  28. Some Courses at DePaul • SE 468: Software Measurement and Estimation • Software metrics. Productivity, effort and defect models. Software cost estimation. PREREQUISITE(S): CSC 423 and either SE 430 or CSC 315 or consent • SE 477: Software and System Project Management • Planning, controlling, organizing, staffing and directing software development activities or information systems projects. Theories, techniques and tools for scheduling, feasibility study, cost-benefit analysis. Measurement and evaluation of quality and productivity. PREREQUISITE(S): SE 465 or CSC 315

  29. Today’s Agenda (Topic / Duration) • Planning and metrics recap: 20 minutes • Analysis modeling: 60 minutes *** Break • Current event reports: 20 minutes • Analysis modeling (cont.): 30 minutes • Design modeling: 30 minutes

  30. Context [framework diagram]: Planning & Managing spans Communication (project initiation, requirements), Modeling (analysis, design), Construction (code, test), and Deployment (delivery, support). Requirements engineering tasks (Ch. 7-8): elicitation, elaboration, specification. Primary deliverables: analysis model and software requirements specification (functional and non-functional requirements).

  31. Use case: often the first part of the analysis model to be developed [framework diagram repeated]: requirements engineering tasks (Ch. 7-8) are elicitation, elaboration, and specification. Use-case deliverables: preliminary use cases, refined use cases.

  32. Use cases • A user scenario • A thread of usage • Tells a story of an actor (end user or device) interacting with the system

  33. Use-Case Diagram

  34. Use case description – narrative “If I’m at a remote location, I can use any PC with appropriate browser software to log on to the SafeHome Products Web site . . . “ Use-case: Activate the system

  35. Use case description – ordered sequence • The homeowner observes . . . • The homeowner uses the keypad . . • The homeowner selects and keys in stay or away . . . • When activation occurs . . . Use-case: Activate the system

  36. Use case description - template Use-case: ActivateSystem Actor: Homeowner Pre-conditions: Trigger: Scenario: Exceptions: . . . Use-case: Activate the system

  37. Analysis model • Combination of text and diagramming • Depicts requirements: • Data • Function • Behavior

  38. Key Definitions • A data model is a formal representation of the data to be used for a business system. • A data model should illustrate: the people, places and things about which data is collected, and how they are related to each other.

  39. Data Modeling Components of a Logical Data Model: a logical data model deliverable includes an ERD and descriptions of entities, attributes, and relationships. • Entity-relationship diagram (ERD) • Entity descriptions • Attribute descriptions • Relationship descriptions

  40. Entities and Instances Instances are occurrences of an entity

  41. Examples of Attributes • Entity: Person • Attributes: • first_name • last_name • eye_color • date_of_birth • address • Entity: Classroom • Attributes: • room_no • max_capacity
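
The two entities on this slide map naturally onto simple record types. A minimal sketch in Python, mirroring the attribute names from the slide (treating room_no as the Classroom identifier is an assumption):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Person:
    first_name: str
    last_name: str
    eye_color: str
    date_of_birth: date
    address: str

@dataclass
class Classroom:
    room_no: str          # assumed identifier
    max_capacity: int

# Each object is one instance (occurrence) of its entity.
p = Person("Ada", "Lovelace", "brown", date(1815, 12, 10), "London")
print(p.first_name, p.date_of_birth)
```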

  42. Depicting Entities, Attributes and Identifiers [diagram: an entity box showing the entity name, its identifier (or, use cd_id (PK)), and its attributes.]

  43. Identifiers • An identifier should have the following characteristics: • Its value should not change over the life of each instance • Its value should never be “null” or empty • It should be composed of the minimal number (preferably one) of attributes required to ensure uniqueness

  44. Relationships represent connections, links, or associations between entities, e.g., Patient-Appointment. Relationships have some important properties: • Names, which should be active verbs (a patient schedules appointments) • Cardinality • Modality

  45. Cardinality and Modality Example: a Department contains Employees; an Employee is assigned to a Department. • Cardinality: implies that an employee is assigned to only one department. • Cardinality: implies that there may be many employees in a department. • Modality: mandatory implies that an employee MUST be assigned to a department. • Modality: optional implies that there may be a situation in which a department has no employees. There are several other ERD notations; unfortunately there is no single standard!

  46. Sample Notation Cardinality (Min, Max): (0, 1); (1, 1); (0, M[any]); (1, M[any])

  47. Cardinality refers to the number of times instances in one entity can be related to instances in another entity. • One instance in an entity refers to one and only one instance in the related entity (1:1) • One instance in an entity refers to one or more instances in the related entity (1:M) • One or more instances in an entity refer to one or more instances in the related entity (M:M)

  48. Modality indicates whether a particular data object MUST participate in the relationship; it refers to the minimum number of times that an instance in one entity can be related to an instance in another entity. • Modality = 0: the relationship is optional. • Modality = 1: the relationship is mandatory.
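
Cardinality and modality can also be read straight out of the types in code. A minimal sketch in Python of the earlier Department/Employee example (class and field names are illustrative): an employee must reference exactly one department (mandatory), while a department may contain zero or more employees (optional, 1:M).

```python
from dataclasses import dataclass, field

@dataclass
class Department:
    name: str
    employees: list["Employee"] = field(default_factory=list)  # 0..M: optional

@dataclass
class Employee:
    name: str
    department: "Department"   # exactly one: mandatory

eng = Department("Engineering")                 # a department with no employees is valid
alice = Employee("Alice", department=eng)       # an employee must be given a department
eng.employees.append(alice)
print(eng.employees[0].name, alice.department.name)
```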

  49. The Entity-Relationship Diagram (ERD)

  50. An ERD Example A Vendor must sell at least 1 CD
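
The business rule in this example ("a Vendor must sell at least 1 CD") is a minimum-cardinality constraint, and one place to enforce it is when the Vendor object is created. A minimal sketch in Python; the class and attribute names are illustrative, with cd_id reused from the earlier identifier slide:

```python
from dataclasses import dataclass

@dataclass
class CD:
    cd_id: str      # primary key
    title: str

class Vendor:
    def __init__(self, name: str, cds_sold: list[CD]):
        if not cds_sold:   # minimum cardinality of the "sells" relationship is 1
            raise ValueError("A Vendor must sell at least 1 CD")
        self.name = name
        self.cds_sold = cds_sold

v = Vendor("Acme Music", [CD("cd-001", "Greatest Hits")])
print(v.name, len(v.cds_sold))
```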
