
Using Software Project Courses to Integrate Education and Research



Presentation Transcript


  1. Using Software Project Courses to Integrate Education and Research Barry Boehm November 18, 2009

  2. Outline
  • Nature of real-client project course
    • Primarily USC campus, neighborhood e-services
    • MS-level; 2-semester; 6-8-person teams
    • Well-instrumented for continuous improvement
  • Research/education integration via project experiments
    • Validate new methods and tools via project usage
    • Partial basis of 12 PhD dissertations
      • Requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
  • Conclusions

  3. 2009-10 Software Engineering Projects

  4. MBASE Model Integration: LCO Stage
  [Diagram: the domain model determines the WinWin taxonomy, stakeholders and primary win conditions, frequent risks, and basic concept of operation, and initializes the WinWin negotiation model; environment models and the IKIWISI model, prototypes, and properties models provide inputs for and help validate the WinWin agreements and shared vision; these initialize and update the requirements description, viable architecture options, outstanding LCO risks, life cycle plan elements, and updated ConOps and business case, which iterate to achieve feasibility and consistency; the anchor point model determines the exit criteria that validate readiness of the Life Cycle Objectives (LCO) package.]

  5. S&C Subdomain (General): Multimedia Archive
  • Example projects: 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 31, 32, 35, 36, 37, 39
  • Simple block diagram: users issue query and update requests to the multimedia archive, which returns MM assets, MM asset catalog info, and update notifications
  • Developer simplifiers: use standard query languages; use standard or COTS search engine; uniform media formats
  • Developer complicators: natural language processing; automated cataloging or indexing; digitizing large archives; digitizing complex or fragile artifacts; rapid access to large archives; access to heterogeneous media collections; automated annotation/description of or assignment of meanings to digital assets; integration of legacy systems

  6. The Results • Projects That Failed LCO Criteria • Post-1997 failures due to non-S&C causes • Team cohesion, client outages, poor performance

  7. Outline
  • Nature of real-client project course
    • Primarily USC campus, neighborhood e-services
    • MS-level; 2-semester; 6-8-person teams
    • Well-instrumented for continuous improvement
  • Research/education integration via project experiments
    • Validate new methods and tools via project usage
    • Partial basis of 12 PhD dissertations
      • Requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
  • Conclusions and references

  8. Empirical Software Engineering Research
  • Empirical software engineering research is generally slow
    • Projects take 2-5 years to complete
    • Improvements confounded with other factors
    • Data generally sparse, hard to compare across projects
  • Team projects are the ESE equivalent of the fruit fly
    • 20 per year, real clients, using industrial-grade processes
    • Teams include 2 off-campus working professionals
    • 1-2 of 6 on-campus students have 1-2 years of work experience
  • Extensive data, consistently collected
  • Opportunities to run (partially) controlled experiments
    • Projects, teams not identical

  9. Project Course Experience Factory
  [Diagram: project environment characteristics feed an experience factory cycle of 1. Characterize, 2. Set Goals, 3. Tailor Process, 4. Execute Process, 5. Analyze, and 6. Disseminate; project support initializes tailorable technology and mentoring, and execution plans are applied and refined; project analysis, process data, lessons learned, and process modifications are formalized into an experience base of products, lessons learned, and models.]

  10. WikiWinWin – Tool
  • Initial ideas are surfaced at the meeting
  • The Shaper organizes them into prospective win conditions after the meeting
  • Stakeholders engage in further discussion
  • The Shaper facilitates negotiation in WikiWinWin

  11. WikiWinWin – Current Progress
  [Charts: LCO package quality shortfall vs. usage by team; LCO package quality shortfall vs. usage by shaper]
  • Initial results (Fall 2007)
    • Correlation between usage aspects and outcomes (one possible computation is sketched below)
    • Not all feedback was positive
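The slide does not say how the usage/outcome correlation was measured; as one plausible (assumed) approach, a rank correlation such as Spearman's could relate each team's WikiWinWin usage to its LCO package quality shortfall. Everything in the sketch below, including the data, is an illustrative placeholder rather than course data.

```python
# Hypothetical sketch: rank correlation between WikiWinWin usage and
# LCO package quality shortfall. Numbers are invented placeholders; the
# slide does not specify the correlation method or the underlying data.
from scipy.stats import spearmanr

wiki_usage = [12, 30, 18, 45, 25, 38, 20, 50]   # e.g., wiki edits per team
quality_shortfall = [9, 5, 8, 2, 6, 4, 7, 1]    # e.g., LCO package quality shortfall points

rho, p_value = spearmanr(wiki_usage, quality_shortfall)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")  # negative rho: more usage, less shortfall
```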

  12. Growth of COTS-Based USC e-Services Projects • Requires new processes, architectures, methods • It’s not “all about programming” anymore • Similar trends at Motorola, other USC-CSE Affiliates* (*Industry data: 2001 Standish Report)

  13. Axiom 1. Process Happens Where the Effort Happens
  • COTS assessment, tailoring, glue code/integration
  [Charts: (a) CBA effort distribution of USC e-service projects; (b) CBA effort distribution of industry COCOTS calibration data]

  14. CBA Spiral Framework

  15. COTS Assessment Example
  • USC Collaborative Services (USCCS) objectives:
    • Project management
    • File management
    • Discussion board
    • Project calendaring
    • Chat room
  • COTS candidates:
    • eProject (577a and 577b)
    • Dot Project (577a)
    • eStudio (577a)
    • eRoom (577a)
    • Blackboard (577b)
    • iPlanet (577b)
  (One way such an assessment can be scored is sketched below.)
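To make the assessment step concrete, the sketch below scores a few of the candidates against the USCCS objectives with a simple weighted-sum scorecard, one common way COTS evaluations are tallied. Only the objective and candidate names come from the slide; the weights and scores are invented placeholders, not the actual 577a/577b evaluation data.

```python
# Hypothetical weighted-sum COTS scorecard for the USCCS example.
# Weights and per-candidate scores are illustrative placeholders.

objectives = {                     # objective -> relative weight (sums to 1.0)
    "project management": 0.30,
    "file management": 0.25,
    "discussion board": 0.15,
    "project calendaring": 0.15,
    "chat room": 0.15,
}

# Candidate -> objective -> satisfaction score on a 0-10 scale (hypothetical).
candidates = {
    "eProject": {"project management": 8, "file management": 7,
                 "discussion board": 6, "project calendaring": 7, "chat room": 4},
    "Dot Project": {"project management": 7, "file management": 5,
                    "discussion board": 4, "project calendaring": 6, "chat room": 2},
    "eRoom": {"project management": 6, "file management": 8,
              "discussion board": 7, "project calendaring": 5, "chat room": 6},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of a candidate's objective scores."""
    return sum(objectives[obj] * scores[obj] for obj in objectives)

# Rank candidates by weighted score, best first.
for name, scores in sorted(candidates.items(),
                           key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name:12s} {weighted_score(scores):.2f}")
```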

  16. Example: USCCS Evaluation Results

  17. Interoperability Evaluation Framework Interfaces

  18. iStudio Tool

  19. Experiment 1 Results
  • *Accuracy of dependency assessment: 1 – (number of unidentified dependencies / total number of dependencies)
  • **Accuracy of interface assessment: 1 – (number of interface interaction mismatches identified / total number of interface interactions)
  • Accuracy: a quantitative measure of the magnitude of error [IEEE 1990]
  (A small computational sketch of these two metrics follows below.)
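As a minimal illustration, the two accuracy measures footnoted above can be computed directly from raw counts. The formulas follow the slide; the example counts are hypothetical.

```python
# Accuracy metrics as defined in the slide's footnotes; counts are hypothetical.

def dependency_accuracy(unidentified_dependencies: int, total_dependencies: int) -> float:
    """Accuracy of dependency assessment:
    1 - (number of unidentified dependencies / total number of dependencies)."""
    return 1.0 - unidentified_dependencies / total_dependencies

def interface_accuracy(mismatches_identified: int, total_interactions: int) -> float:
    """Accuracy of interface assessment:
    1 - (number of interface interaction mismatches identified /
         total number of interface interactions)."""
    return 1.0 - mismatches_identified / total_interactions

# Hypothetical example: 2 of 40 dependencies missed; 2 mismatches over 50 interactions.
print(f"Dependency assessment accuracy: {dependency_accuracy(2, 40):.2f}")
print(f"Interface assessment accuracy:  {interface_accuracy(2, 50):.2f}")
```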

  20. Model Clashes Occurrence Probability & Severity: 35-Project RAD Sample
  [Chart legend: S = Stakeholder; M = Model; OP = Occurrence Probability; SV = Severity]

  21. Clash Types and Their Contribution to Project Risk
  Clash type          % of clashes   % of risk
  Success-Property          4             6
  Success-Product          12            17
  Success-Success           3             4
  Product-Property         16            20
  Process-Property          4             5
  Property-Property        13            12
  Product-Product          30            24
  Process-Process           7             5
  Product-Process           6             4
  Success-Process           5             3
  [Chart annotations: risk-to-clash ratios 1.3, 1.4, 0.8]
  • Majority of research (product-product) addresses a minority of the risk (see the ratio sketch below)
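Using the percentages from the table above, the short sketch below computes each clash type's risk-to-clash ratio; product-product clashes, where most research attention goes, come out at 0.8, while several other clash types exceed 1.0. The percentages are taken from the slide.

```python
# Risk-to-clash ratios from the slide's percentages:
# clash type -> (% of clashes, % of risk).
clash_data = {
    "Success-Property": (4, 6),
    "Success-Product": (12, 17),
    "Success-Success": (3, 4),
    "Product-Property": (16, 20),
    "Process-Property": (4, 5),
    "Property-Property": (13, 12),
    "Product-Product": (30, 24),
    "Process-Process": (7, 5),
    "Product-Process": (6, 4),
    "Success-Process": (5, 3),
}

# Sort by risk-to-clash ratio, highest first; Product-Product lands at the bottom (0.80).
for clash_type, (pct_clashes, pct_risk) in sorted(
        clash_data.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True):
    print(f"{clash_type:18s} risk/clash ratio = {pct_risk / pct_clashes:.2f}")
```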

  22. Inter and Intra Model Clashes and Their Contribution to Project Risk
  [Charts: distribution of inter vs. intra model clashes (% of total model clashes) and their contribution to project risk (% of total risk); values shown: 53%, 55%, 47%, 43%]
  • Inter-model clashes caused the majority of risk

  23. Value-Based Review Process (II)
  [Diagram: users, customers, developers, and other stakeholders review artifacts; priorities of system capabilities come from the negotiation meeting, criticalities of issues from the domain expert; reviews are guided by a general value-based checklist and artifacts-oriented checklists]
  Review-order matrix (the number indicates the usual ordering of review*):
                        Priority: High   Medium     Low
  Criticality: High              1         4          6
  Criticality: Medium            2         5       optional
  Criticality: Low               3      optional   optional
  * May be more cost-effective to review highly-coupled mixed-priority artifacts.
  (The matrix is encoded as a lookup table in the sketch below.)
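The review-order matrix above can be treated as a small lookup table. The sketch below encodes it exactly as shown on the slide; a number is the usual review order, and None marks the combinations the slide leaves optional.

```python
# Value-based review ordering from the slide's priority/criticality matrix.
# Keys are (issue criticality, capability priority); values are the usual
# review order, or None where the slide marks the review as optional.
REVIEW_ORDER = {
    ("high", "high"): 1,   ("high", "medium"): 4,   ("high", "low"): 6,
    ("medium", "high"): 2, ("medium", "medium"): 5, ("medium", "low"): None,
    ("low", "high"): 3,    ("low", "medium"): None, ("low", "low"): None,
}

def review_order(criticality: str, priority: str):
    """Return the usual review order for an artifact, or None if the review is optional."""
    return REVIEW_ORDER[(criticality.lower(), priority.lower())]

# Example: a high-priority capability with medium-criticality issues is reviewed second.
print(review_order("medium", "high"))  # 2
```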

  24. Value-Based Checklist (I) <General Value-Based Checklist>

  25. Value-Based Reading (VBR) Experiment (Keun Lee, ISESE 2005)
  • Group A: 15 IV&V personnel using VBR procedures and checklists
  • Group B: 13 IV&V personnel using previous value-neutral checklists
  • Significantly higher numbers of trivial typo and grammar faults

  26. Pair Development vs. Fagan Inspection (TDC = Total Development Costs)

  27. Lean MBASE Effort Comparison
  • Average number of hours spent on documentation: less effort, except SSAD in Fall 2005
  • Average number of hours per page of documentation: fewer hours per page, except SSRD in Fall 2006

  28. ICM Electronic Process Guide

  29. Integrating Software Research and Education
  • Empirical software engineering research is generally slow
    • Projects take 2-5 years to complete
    • Improvements confounded with other factors
    • Data generally sparse, hard to compare across projects
  • MS-student projects are the ESE equivalent of the fruit fly
    • 20 per year, real clients, using industrial-grade processes
  • Extensive data, consistently collected
  • Opportunities to run (partially) controlled experiments
    • Projects, teams not identical
  • Results frequently correlate with industry experience
  • Results strengthen future educational experiences
