
Software Engineering Research / Developer Collaborations



1. Software Engineering Research / Developer Collaborations
Tom Pressburger, Research Infusion Lead (ARC)
Ben Di Vito (LaRC), Martin Feather (JPL), Michael Hinchey (GSFC), Lawrence Markosian (QSS Group, ARC), Tim Menzies (Portland State Univ., IV&V), Luis Trevino (MSFC)

2. Outline
• Research Infusion Goal, Strategy and Process
  • Goal and Strategy
  • Research Identification & Selection
  • Proposal Solicitation & Selection
• Research Infusion 2004 Technologies and Collaborations
• Research Infusion 2005 Technologies and Proposals
• Success Criteria for Research Infusion

3. Research Infusion: Goal, Strategy and Process

4. Goal and Strategy
• Infuse into practice higher-maturity, high-potential-value, primarily (but not solely) NASA-funded software engineering research.
• Do this by funding small-scale collaborations between researchers and innovators or early adopters,
• where the Research Infusion funding is used primarily to cover the cost of technology insertion (primarily training & consulting support) and evaluation.
• Primary funding from OSMA/SARP, plus significant co-funding.

5. Research Identification & Selection
• Asked software engineering research program managers which of their research technologies were sufficiently mature and promising.
• Solicited input from software developers about software engineering research they wanted to try out.
• Surveyed research PIs.
• Evaluated research technologies against selection criteria.

6. Research Evaluation Criteria
Aim for a "quick win":
• Has been applied to real problems.
• Exceptional potential:
  • range of applicability
  • expected productivity improvement
• Low risk of failure for technical reasons (e.g., won't scale).
• Focus on software quality, in accordance with OSMA funding goals.

7. Research Evaluation Criteria (continued)
• Ease of integration into a software development project:
  • requires little change to the current development process
  • reasonable training
  • doesn't need widespread utilization
  • for tools, a decent user interface
  • support is available
• Value is clearly apparent to intended users.
• Intended users won't resist it.

8. Proposal Solicitation & Selection: Research Infusion ViTS (09/23/2003, 05/18/2004)
• Purpose: solicit proposals from developers for technology infusion pilots
• 200 attendees from 11 NASA Centers
• Audience: engineers, leads, managers, and quality assurance civil servants & contractors
• Attendee profiles captured in a database

9. Proposal Solicitation & Selection: Proposal Development
• Highly focused CFP
• CFP publicized among software development organizations:
  • NASA
  • NASA contractors
• We encourage proposal writers to communicate with us and obtain feedback on draft proposals before the deadline.
• Our goal is to obtain only high-quality, appropriate proposals,
  • NOT to solicit many proposals and filter them.
• This proposal process reduces work on everyone's part and results in a high acceptance rate.

10. Proposal Solicitation & Selection: Proposal Review
• Proposal evaluation criteria (weights in parentheses; a scoring sketch follows this slide):
  • Feasibility: are the skills of the participants relevant, etc.? (0.3)
  • Impact on NASA: is the technology being applied to an important project? (0.35)
  • Likelihood, if successful, that it will become part of the development team's practice. (0.2)
  • Adequate feedback is provided to the researchers and the Research Infusion team (for example, calls to the researchers about possible bugs, metrics data, and a final report). (0.05)
  • Good use of NASA funds. (0.1)
• Award recommendations to SARP
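
To make the weighting concrete, below is a minimal weighted-sum scoring sketch in C. Only the five weights come from the slide; the 0-to-5 rating scale and the sample ratings are assumptions for illustration.

    #include <stdio.h>

    /* Hypothetical weighted-sum scoring of one proposal against the five
       published criteria. The 0-5 rating scale and the sample ratings are
       assumptions; only the weights appear on the slide. */
    int main(void) {
        const char  *criteria[] = { "Feasibility", "Impact on NASA",
                                    "Likelihood of adoption", "Feedback provided",
                                    "Good use of NASA funds" };
        const double weights[]  = { 0.30, 0.35, 0.20, 0.05, 0.10 };
        const double ratings[]  = { 4.0, 3.0, 3.5, 5.0, 4.0 };  /* samples only */
        double score = 0.0;
        for (int i = 0; i < 5; i++) {
            score += weights[i] * ratings[i];
            printf("%-24s weight %.2f x rating %.1f\n",
                   criteria[i], weights[i], ratings[i]);
        }
        printf("weighted score: %.2f of 5.00\n", score);
        return 0;
    }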

11. Research Infusion 2004: Technologies and Collaborations

12. 2004 Technologies
• C Global Surveyor (NASA Ames, CICT)
  • Static analysis defect detection tool
• Usability & Architecture (CMU, ECS)
  • Architecture design methodology
• CodeSurfer (GrammaTech)
  • Reverse engineering/debugging toolset
• Perspective-based Inspections (Fraunhofer, SARP)
  • Software inspection methodology
• Coverity SWAT (Coverity)
  • Static analysis defect detection tool
• Orthogonal Defect Classification for NASA (JPL, SARP)
  • Process improvement methodology
• Java Path Explorer (NASA Ames, CICT)
  • Testing tool

13. 13 Proposals, 6 Awards
• Perspective-Based Inspections
  • ISS Power System Analysis – USA
  • 3 Space Flight Missions – GSFC
• C Global Surveyor
  • ISS Science Applications – ARC
  • Several flight software projects (will also use Coverity SWAT) – MSFC
• Orthogonal Defect Classification
  • Deep Space Network Antenna Controller software – JPL
• CodeSurfer
  • Inertial Navigation/GPS for ISS, Shuttle, X-38/CRV – JSC
Important applications, good matches to the technologies.

14. Status of Example Collaborations
• ODC at JPL
• Perspective-Based Inspections at GSFC and USA
• CGS at MSFC and ARC

15. Orthogonal Defect Classification
PoC: Robyn Lutz, JPL. Funding: SARP and the National Science Foundation
• What problems does ODC address?
  • Process improvement: learning lessons from defect logs
  • Currently: defect logs are in many incompatible formats
  • With ODC: a generalized schema for defect logs
• What is it?
  • A method for analyzing software bugs to determine patterns and improve the software development process
  • First developed ~1990 by Ram Chillarege at IBM; now widely used in industry
  • When a fault is first seen: record the "activity" and "triggering event"
  • When a fault is fixed: record the "target" and "type" of the fix
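
To make the schema concrete, here is a minimal sketch of an ODC-style defect record as a C struct. The four attributes (activity, trigger, target, type of fix) come from the slide; the enumeration values below are illustrative assumptions, not the project's customized schema.

    #include <stdio.h>

    /* Illustrative ODC-style defect record. The enum values are assumed
       examples, not the schema customized for the JPL project. */
    typedef enum { ACT_INSPECTION, ACT_UNIT_TEST, ACT_SYSTEM_TEST } Activity;
    typedef enum { TRIG_BOUNDARY, TRIG_TIMING, TRIG_RECOVERY }      Trigger;
    typedef enum { TARGET_CODE, TARGET_DESIGN, TARGET_BUILD }       Target;
    typedef enum { TYPE_ASSIGNMENT, TYPE_INTERFACE, TYPE_ALGORITHM } FixType;

    typedef struct {
        int      id;        /* anomaly report number (e.g., an IAR)   */
        Activity activity;  /* recorded when the fault is first seen  */
        Trigger  trigger;   /* recorded when the fault is first seen  */
        Target   target;    /* recorded when the fault is fixed       */
        FixType  type;      /* recorded when the fault is fixed       */
    } OdcRecord;

    int main(void) {
        OdcRecord r = { 42, ACT_SYSTEM_TEST, TRIG_TIMING,
                        TARGET_CODE, TYPE_ASSIGNMENT };
        printf("IAR %d: activity=%d trigger=%d target=%d type=%d\n",
               r.id, r.activity, r.trigger, r.target, r.type);
        return 0;
    }

Aggregating such records (e.g., counts of each fix type per subsystem) is what yields the defect-pattern charts described on the next slides.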

16. Orthogonal Defect Classification (continued)
• Features
  • Language- and platform-independent
  • Produces customizable Excel graphs
  • Much local expertise
  • Useful to a single project or to an organization
• Benefits
  • Provides a quantitative basis for process improvement
  • Establishes a baseline for patterns of software defects
  • Much less expensive than root-cause analysis
  • Provides guidance in allocating funds for post-launch maintenance
  • Enables effective corporate memory

17. [Process-flow diagram: customizing and applying the ODC schema to the antenna controller software's anomaly reports. Blue = (primarily) researchers; red = (primarily) project. IAR = Internal Anomaly Report; ODC = Orthogonal Defect Classification.]
• Completed to date: ODC's generic "schema" was customized for the project; 27 randomly selected IARs (of the 167 from testing of the original software) were studied and classified; the 27 classified IARs were presented, the results were reviewed, and the schema was augmented.
• Underway: all 167 IARs from the testing of the original software are being classified, presented and analyzed. A draft bar chart (y-axis 0 to 4) plots defect types (Function/Algorithm, Assignment/Initialization, Testbed Environment, Timing, Script, Interface, Nothing Fixed, Unknown) across the AMC, AMT, APC, APS, CTL, PLC and PRS software.
• Future: the IAR logging system will include the custom ODC schema as fields with pull-down menus; IARs logged during reuse of the software will be ODC-classified as they are logged; sample IARs, then all IARs, will be presented and analyzed.

18. Collaboration Highlights
• Good news
  • Progress so far has gone well, including the involvement of the project.
• Disappointment
  • Other factors place competing demands on project personnel time; consequently, some work must be delayed beyond this quarter. However, the end is in sight for the interfering factors.
• Excellent news
  • JPL's "Software Quality Initiative" (SQI) paid for Carmen Mikulski's time to train Belinda Wilkinson on ODC. The SQI and ODC tasks are now teamed to work with the Antenna Retrofit Task:
    • SARP funding leveraged by SQI
    • enterprise interest in ODC stimulated by SQI
    • interactions with the project encompass both ODC and enterprise measures (avoids duplication of effort, increases utility for the project)

19. Perspective-Based (PB) Inspections
Forrest Shull et al., Fraunhofer Center (SARP)
• What is it?
  • An approach to software inspections that has been shown to increase the number of defects detected and removed.
• What problem does it address?
  • Efficiency of the inspection process.
• Approach:
  • As in other inspections, planners identify key stakeholder perspectives for the document and ensure they are represented at the inspection.
  • Unlike other inspection processes:
    • Each inspector on a team gets different preparation aids and is responsible for only a logical subset of the document (based on the perspective they represent).
    • Each perspective has a defined procedure (based on development activities), not a checklist of items, to help in defect detection.
    • Perspectives help increase team buy-in by relating inspection activities directly to the inspectors' development activities and experience.
  • The written procedures are an "experience base" of tested practices that evolve over time:
    • a written record of expert practices usable by less experienced team members.

20. Two Collaborations Underway
• Tech transfer plan for PB inspections:
  • Understand team-specific quality concerns (via interviews with the team lead and project personnel)
  • Understand likely perspectives for the team (via an interview with the team lead)
  • Refine the set of perspectives and define procedures for each (via interviews with project personnel), e.g.:
    • which quality concerns map to which perspectives
    • what is a feasible and effective process for checking those concerns
  • Finalize procedures and provide training
  • Analyze ongoing inspection results (and update procedures if necessary)
  • Monitor the downstream defect profile
  • Write the final report
• GSFC/FSB status: Based on interviews with personnel recommended by Branch Head Elaine Shell, requirements inspection standards and FSB-specific procedures have been developed. On target for delivery to the Branch Change Control Board for review, July 7.
• USA status: Dr. Shull visited Houston to provide training in the tailored PB procedures, June 17. A practice PB inspection by the team found several minor defects in project code. Data collection mechanisms are in place and requirements inspections are beginning.

21. Perspective Examples
• Code inspection perspectives developed for USA's SPEED project:
  • "SPEED expert" developer: focuses on checking code correctness and identifying areas for possible reuse. The inspection procedure involves using his or her prior experience to identify the most complex and important classes, then tracing the flow of control.
  • "SPEED novice" developer: focuses on understanding the necessary system functionality and checking whether the code achieves that functionality. The procedure involves using the use cases to trace the functionality to the code.
  • Designer: focuses on verifying that good design practices were used to structure the code. The procedure is to review the requirements, then compare them to the architecture information to understand how those requirements are met.
  • COTS expert: focuses on whether all opportunities for reuse have been exploited, and whether the interface between SPEED and the COTS packages is correct. The procedure is to identify the use cases that include reusable functionality, then trace into the code.
• Other perspectives developed so far: for USA SPEED requirements and for GSFC FSB requirements.

22. C Global Surveyor
G. Brat & A. Venet, Automated Software Engineering Group, NASA ARC. Funding: Code R, Communications, Information, and Computing Technology Program, Intelligent Systems Project
["I shouldn't have turned off the engine so soon…" A badly initialized variable caused Mars Polar Lander to crash on Mars.]
• What is it?
  • A static analysis tool for finding defects in C applications
  • Based on "abstract interpretation"
• What problem does it address?
  • Fast, precise code analysis to detect defects that are hard to find through testing:
    • out-of-bounds array and pointer accesses
    • uninitialized pointers
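
For illustration, a minimal C fragment containing both defect classes follows. It is an assumed example, not code from any of the projects analyzed; the uninitialized array of file pointers mirrors the defect CGS later found in the ARC collaboration (slide 24).

    #include <stdio.h>

    #define N 8

    /* Illustration only: both defects below are intentional, and both
       belong to the classes CGS targets. */
    int main(void) {
        FILE *logs[N];                 /* array of file pointers, never initialized */
        int samples[N];
        for (int i = 0; i <= N; i++)   /* off-by-one: i == N writes past the array */
            samples[i] = 0;
        fclose(logs[0]);               /* dereferences an uninitialized pointer */
        return 0;
    }

Both defects are reachable on every execution, yet neither is guaranteed to crash a test run, which is why static analysis rather than testing is the pitch here.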

23. C Global Surveyor (continued)
• Technology originally tuned for Mars Pathfinder:
  • shallow data structures
  • interprocess communication via message passing
• Features
  • Scales to applications of at least 600K LOC
  • Precise alias analysis
  • Complete path coverage
  • Very low false-positive rates
  • Can detect specific classes of errors in flight software, with an expected 90% reduction in the testing required for these errors
  • Analysis can be tailored to a specific software architecture
• Successes: identified many undefined variables in
  • 130K SLOC of Mars Pathfinder code in 45 minutes (for the complete system)
  • 280K SLOC of DS-1 code in 2 hours (also for the complete system)

24. Two Collaborations Underway
• ARC application: Habitat Holding Rack Interface Controller
  • Found an uninitialized array of file pointers.
  • The project wants to continue using the tool.
• MSFC: no bugs found yet in the applications; CGS upgrades are to be installed at MSFC.
• Feedback to the development team: CGS needs to be extended to handle
  • bit fields
  • include files of source code
  • hardware pointers

25. Research Infusion 2005: Technologies and Collaborations

26. 2005 Technologies
• Software Cost Reduction (Connie Heitmeyer, NRL)
  • Toolset for formalizing, analyzing and verifying software requirements specifications
• SpecTRM (Safeware Engineering)
  • Toolset for formalizing, analyzing and verifying software requirements specifications
• Software Architecture Evaluation (Fraunhofer)
  • Tools and methodology for verifying that source code implements the intended design
• Usability & Architecture*
  • Methodology for verifying consistency of architecture with usability requirements
• Orthogonal Defect Classification for NASA*
  • Process improvement methodology
* Re-offered from last year's list

27. 2005 Technologies (continued)
• UML Tools (Siemens)
  • Design quality evaluator
  • Test development environment
• MATT (Joel Henry, U. Montana, SARP)
  • Verification and testing tools for Matlab models and Matlab-generated code
• FLUID (William Scherlis, CMU, HDCP)
  • Static source code analysis technology for detecting race conditions and certifying the absence of race conditions in Java code
• CodeSurfer*
  • Reverse engineering/debugging toolset
* Re-offered from last year's list
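
FLUID itself analyzes Java; purely to illustrate the defect class it targets, the sketch below shows a minimal data race in C using POSIX threads (an assumed stand-in, not FLUID's input language). Two threads increment a shared counter without a lock, so updates can be lost.

    #include <pthread.h>
    #include <stdio.h>

    /* Illustration only: a classic data race. The increment is an
       unsynchronized read-modify-write on shared state. */
    static long counter = 0;

    static void *worker(void *arg) {
        (void)arg;
        for (int i = 0; i < 100000; i++)
            counter++;              /* racy: no lock protects counter */
        return NULL;
    }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        /* Often prints less than 200000 because increments are lost. */
        printf("counter = %ld (expected 200000)\n", counter);
        return 0;
    }

Protecting the increment with a mutex removes the race; certifying that such unsynchronized accesses are absent is what FLUID aims to provide for Java code.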

28. Proposals for 2005 Collaborations
• CodeSurfer for a Space Science project at IVVF
• SCR for a Station science payload display at ARC
• SpecTRM for a mission design team at JPL
• UML Design Advisor / Test Development Environment for the James Webb Space Telescope Science Instrument Module at GSFC

29. Success Criteria for Research Infusion

30. Success Criteria for Research Infusion
1. The success criteria of the collaboration projects funded under this proposal are met.
2. The research product is adopted by the collaborating software development team for current use. (USA, ARC)
3. The research product is included in a list of recommended development practices at a NASA Center or by a contractor.
4. The software development team using the product provides feedback to the research team. (MSFC, ARC)
5. Six months after the funded collaboration period, the research product is still being used by the development project.
6. The research and user teams recommend to the CTO methods of making future versions of the research products available within NASA.
7. Independent of the success of the collaborations, "lessons learned" regarding the challenges and success factors for software development technology infusion are available within NASA.

31. Lessons Learned
• There are developers who will bring on new software engineering technologies when only the cost of insertion is covered.
• A tightly structured proposal process, plus pre-submission screening and discussions with proposal teams, yields high-quality proposals.
• "Small" awards ($25–50K) seem to be having a significant impact in infusing software engineering research.
• Technology infusion takes a back seat to project goals.
• We need to expand communication channels with development teams:
  • ViTS is not enough; we need much broader access to the software development community.
• The entry cost (including formal proposal submission) is sometimes high enough that a contractor will forgo the funding rather than take advantage of the opportunity.

32. Summary
• A model for transfer of mature, easily adoptable technology:
  • concise presentations to attract customers
  • "small", customer-motivated pilot projects
• This year's collaborations will probably be successful.
  • Some have already borne fruit.
• Seeking:
  • new technologies
  • new customers (and new ways to reach them)
