
Pair Development Framework



Presentation Transcript


  1. Pair Development Framework Monvorath (Molly) Phongpaibul

  2. Outline • Pair Development Guideline • Pair Development Management Issues

  3. Pair Development Guideline

  4. The ETVX Paradigm [Diagram: Entry → Task → Validate → Exit]
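The ETVX pattern on slide 4 can be sketched in code: a task runs only when its entry criteria hold, and its output must pass validation before exit. This is a minimal sketch, not part of the framework; the function names and the toy task are illustrative assumptions.

```python
# Minimal sketch of the ETVX (Entry-Task-Validate-Exit) paradigm.
# All names here are hypothetical placeholders, not from the slides.

def run_etvx(entry_ok, task, validate):
    """Run a task only when its entry criteria hold, and report
    whether its output passes validation (the exit condition)."""
    if not entry_ok():
        return None, False               # entry criteria not met; do not start
    artifact = task()                    # perform the task
    return artifact, validate(artifact)  # exit only if validation passes

# Usage: a toy task whose entry criteria and validation both hold.
artifact, ok = run_etvx(lambda: True, lambda: "artifact", lambda a: bool(a))
```

The same gate-then-validate shape recurs in the Fagan and pair-development variants that follow.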

  5. Fagan’s Inspection [Flow diagram: Entry → Task → Inspection → Valid? (Yes → OK → Exit; No → back to Task)]
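What distinguishes the inspection flow above from plain ETVX is the back edge: a "No" from the validity check sends the artifact back for rework before exit. A hedged sketch of that loop, assuming a simple rework function; the names and the bounded-round model are illustrative, not from the slides.

```python
# Sketch of the inspection loop: the artifact is checked, and the
# "Valid? -> No" edge sends it back to the task for rework.

def inspect_until_valid(draft, rework, valid, max_rounds=10):
    """Rework the draft until it passes inspection or rounds run out."""
    artifact = draft
    for _ in range(max_rounds):
        if valid(artifact):
            return artifact          # OK -> Exit
        artifact = rework(artifact)  # No -> back to Task
    raise RuntimeError("exit criteria never met")

# Usage: a toy artifact that becomes valid after two rework passes.
result = inspect_until_valid(0, lambda a: a + 1, lambda a: a >= 2)
```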

  6. Pair Development [Diagram: Entry → Pair Development Task → Validate (OK) → Exit]

  7. SSRD Modeling • Where can pair activity take place? • Activities with “Anchor” points: MBASE/RUP

  8. Personal Software Process (PSP) and Collaborative Software Process (CSP)

  9. Fagan’s Inspection Process

  10. Pair Development Processes [Process diagram linking: Plan, Capabilities & Requirements → Pair Requirements Modeling → Pair Design & Analysis → Pair Programming → Testing, with Pair Test-Plan producing the test plan and Pair Testing producing the test cases]

  11. Pair Requirements Modeling Activity Entry Criteria • Operational Concept Description (OCD) • Problem Description • Organization Background • Organization Goals & Constraints • Project Goals & Constraints • Current System • SSRD Guideline (for model) • SSRD / Requirement Checklist (if provided) • Defect Classification List
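Entry criteria like those on slide 11 can be treated as a checklist that gates the activity: required artifacts must be present, while "(if provided)" items are optional. A sketch under that assumption; the dict-of-names representation and function name are illustrative.

```python
# Hedged sketch: an activity's entry criteria as a simple checklist.
# Artifact names follow slide 11; the representation is assumed.

REQUIRED = [
    "Operational Concept Description (OCD)",
    "Problem Description",
    "SSRD Guideline",
    "Defect Classification List",
]
OPTIONAL = ["SSRD / Requirement Checklist"]  # "(if provided)" items

def missing_entry_criteria(available):
    """Return the required artifacts not yet available (empty = may start)."""
    return [item for item in REQUIRED if item not in available]

# Usage: only two of the required artifacts are on hand.
missing = missing_entry_criteria({"Problem Description", "SSRD Guideline"})
```

The same check applies, with different artifact lists, to the design, programming, and testing activities on the following slides.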

  12. Pair Design and Analysis Activity Entry Criteria • System and Software Requirement Description (SSRD) • SSAD Guideline • SSAD / Architecture Design Checklist (if provided) • Defect Classification List

  13. Pair Programming Activity Entry Criteria • System and Software Architecture Description (SSAD) • Coding Standard • Coding Checklist (if provided) • Defect Classification List • Test Cases and Test Description (if provided)

  14. Pair Test Plan Activity Entry Criteria • System and Software Requirement Description (SSRD) • System and Software Architecture Description (SSAD) • Risk Items • Test Plan Guideline / Standard • Test Plan Checklist (if provided) • Test Coverage Checklist

  15. Pair Testing Activity Entry Criteria • Code / Program • System and Software Requirement Description (SSRD) • Test Plan • Test Description Guideline / Standard • Test Result Guideline / Standard • Test Coverage Checklist

  16. Pair Development Meta Task Model

  17. Preparation Sub Task Objective: • Evaluate the entry criteria and understand what is available. • Educate the developer. • Familiarize the developer with the entry criteria, such as the requirement specification, design specification, checklists, and test cases.

  18. Planning Action Sub Task Objective: • Agree on what needs to be done and on the methodology for solving the problem. • Ensure that the pair has the same goals and objectives. • Agree on the communication protocol. • Reduce jelling time.

  19. Execution Sub Task Objective: • Implement the artifact based on the goals and objectives set in the planning action sub task. • Ensure that the artifact being produced follows the methodology discussed in the planning action sub task. • Collect useful data for quality control and process improvement.

  20. Execution Sub Task (Pair Testing) [Flow diagram: Generate Test Cases → Test Cases/Description → Run Test Cases → Pass Test Case? (Yes → Test Results; No → Identify Defects), alongside Effort and Defect Data Collection]
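The pair-testing execution flow on slide 20 (run test cases, record results, log defects for failures) can be sketched as a short loop. The function name, case representation, and toy program are assumptions for illustration only.

```python
# Sketch of slide 20's loop: Run Test Cases -> Pass? -> Test Results /
# Identify Defects, with results collected for data analysis.

def run_test_cases(cases, program):
    """Run each (input, expected) case; return results and a defect list."""
    results, defects = [], []
    for inp, expected in cases:
        actual = program(inp)
        passed = (actual == expected)
        results.append((inp, passed))                # test results
        if not passed:
            defects.append((inp, expected, actual))  # identify defects
    return results, defects

# Usage against a toy program; the third case exposes a discrepancy.
results, defects = run_test_cases([(1, 2), (2, 4), (3, 7)], lambda x: 2 * x)
```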

  21. Execution Sub Task (Pair Testing) Objective: • Implement the test cases based on the goals and objectives set in the planning action sub task. • Ensure that the test cases being produced follow the methodology discussed in the planning action sub task. • Ensure that the test cases give full coverage. • Collect useful data for quality control and process improvement.

  22. Execution Sub Task (Pair Testing)

  23. Assurance Sub Task Objective: • Ensure that the work product meets the exit criteria. • Ensure that all concerns have been discussed. • Ensure that the identified defects have no remaining implications. • Ensure that the artifact being produced reaches the quality goals set in the planning action sub task.

  24. Pair Development Management Issues

  25. Planning Issues • Which activities and which modules should be done by a pair? • Risk-based • Value-based • How should individuals be allocated to pairs? • Pair Compatibility • Jelling Time vs. Training Time • Pair Rotation • When should pair members switch roles (driver/observer)? • What is the environment for working in a pair?

  26. Peer Review Decision Model [Decision diagram: Peer Review?]

  27. Measurement • What should be measured, and how can we capture the data? • Development Effort • Defect List • Defect Type (http://www.research.ibm.com/softeng/ODC/ODC.HTM) • Size of Work Product • Pair Development data collection (four different forms): • Plan Announcement • Pair Development Log • Area of Concern Log • Summary • QMIS (http://morro.usc.edu:12345/qmis/login.jsp)
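The measures listed above (development effort, defect list, work-product size) suggest a per-activity log record along the lines of the Pair Development Log form. A sketch only; the field names and shape are assumptions, not the actual form.

```python
# Hypothetical pair-development log entry capturing the measures on
# slide 27: effort, work-product size, and a defect list.
from dataclasses import dataclass, field

@dataclass
class PairDevelopmentLog:
    activity: str
    effort_minutes: int = 0
    size_loc: int = 0                              # size of work product
    defects: list = field(default_factory=list)    # (severity, class, type)

    def log_defect(self, severity, defect_class, defect_type):
        """Record one defect using the categories from slides 29-30."""
        self.defects.append((severity, defect_class, defect_type))

# Usage: one session's record with a single logged defect.
log = PairDevelopmentLog("Pair Programming", effort_minutes=90, size_loc=120)
log.log_defect("Major", "Wrong", "Interface")
```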

  28. What Is a Defect? • An instance of non-conformance with the initiating requirements, standards, or exit criteria • Can exist in the accuracy/completeness of requirements, standards, and associated interface/reference documents • Identified by team consensus during inspection meeting based on requirements/standards

  29. Defect Categories • Severity a. Major • A condition that causes an operational failure or malfunction, or prevents attainment of an expected or specified result • Information that would lead to an incorrect response or misinterpretation of the information by the user • An instance of non-conformance that would lead to a discrepancy report if implemented as is b. Minor • A violation of standards, guidelines, or rules that would not lead to a discrepancy report • Information that is undesirable but would not cause a malfunction or unexpected results (bad workmanship) • Information that, if left uncorrected, may decrease maintainability

  30. Defect Categories (continued) Class a. Missing • Information that is specified in the requirements or standards but is not present in the document b. Wrong • Information that is specified in the requirements or standards and is present in the document, but the information is incorrect c. Extra • Information that is not specified in the requirements or standards but is present in the document Type • Describes what kind of areas the inspected document has defects in (often an “-ility”; e.g., grammar, syntax) • Inspection teams should define and tailor the classes to the work product
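The severity and class taxonomy on slides 29-30 maps naturally onto enumerations, with the Type dimension left free-form since teams are told to tailor it. A sketch of that encoding; the enum names are assumptions.

```python
# The defect taxonomy of slides 29-30 as enums; Type stays a string
# because inspection teams define and tailor it per work product.
from enum import Enum

class Severity(Enum):
    MAJOR = "major"   # would cause a failure or a discrepancy report
    MINOR = "minor"   # standards violation or maintainability issue

class DefectClass(Enum):
    MISSING = "missing"  # specified but absent from the document
    WRONG = "wrong"      # present in the document but incorrect
    EXTRA = "extra"      # present but not specified

# Usage: one classified defect as a (severity, class, type) triple.
defect = (Severity.MAJOR, DefectClass.WRONG, "Interface")
```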

  31. Questions and Recommendation - Thank You -
