
Presentation Transcript


1. Software Inspections at NASA: Lessons Learned Report on State of the Practice
Forrest Shull, Fraunhofer Center for Experimental Software Engineering - Maryland
Judith Bachman, Computer Sciences Corporation
John Van Voorhis, Fraunhofer Center for Experimental Software Engineering - Maryland
Mike Stark, GSFC
John Kelly, JPL

2. Project Information
• This initiative is a collaboration between JPL, GSFC, CSC, and FC-MD, funded by the Office of Safety and Mission Assurance (OSMA) Software Assurance Research Program.
• Major objectives include:
  • A lessons learned report on the current state of the practice in inspections at NASA
  • Experiments to investigate tailoring and training for new inspection techniques
  • Pilot studies, with long-term support, for piloting new inspection techniques

3. Lessons Learned: Process
"Inspection" defined broadly: any technical examination process during which a product is examined for defect detection by more than just the author.
• Analyzed existing process recommendations.
• Contacted key personnel at centers for potential participants, then contacted participants directly.
• Distributed a characterization questionnaire.
• Conducted semi-directed interviews to elicit processes, attitudes, and experiences.
• Analyzed the qualitative data for lessons learned.

4. Lessons Learned: Subjects
• 17 subjects: 9 GSFC & Wallops; 4 JPL; 2 GRC; 1 JSC; 1 LRC
  • 7 years on average at their current center
  • Majority participated in >10 inspections at that center
• Projects: flight SW; data capture; control centers; mission planning
  • Team size mostly <10 people; 2/3 of projects last longer than 1 year
• ¾ of respondents currently do inspections.
• Not a comprehensive or random sample, but experienced people and representative environments.

5. Characterization Results (1)
• What do people inspect?
  • Most often: requirements, high-level design, low-level design (14/17)
  • Also test plans (12/17) and code (13/17)
• Only 1 respondent picked code inspections as most beneficial: that person took over management in mid-development.
• The other respondents who inspect code develop many small projects with little mission criticality, or don't have a chance to influence requirements.
• Teams tend to start inspections in the first phase where:
  • they have input, and
  • the document is sufficiently stable and complicated.

6. Characterization of Development Errors
[Chart from the original slide; not captured in this transcript.]

7. Characterization Results (2)
• Why do people inspect?
  • "Changing/missing/misinterpreted requirements": on average the most important, and the most often mentioned, source of errors.
  • "Unstable requirements" were also a major development problem.
  • "Coding errors" were mentioned by everyone but consistently ranked least important.
  • The most important thing to look for in any phase is consistency with the previous documents.
• How do people inspect?
  • Almost all knew of, or were trained on, a process (JPL, SSDM, RA); 9 used a process.
  • Very few use checklists.
  • Only 8 ever collected metrics, with no correlation to size or mission criticality.
  • Those who still collect metrics use relatively simple ones, with simple tools: websites, Excel, Access, and the like (a sketch of such lightweight logging follows below).
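To make "relatively simple metrics" concrete, the sketch below shows the kind of lightweight logging a spreadsheet- or Access-based team might use: one CSV row of effort and defect counts per inspection. All field names are hypothetical illustrations, not drawn from any NASA process (Python):

    # Lightweight inspection-metrics log, in the spirit of the simple
    # Excel/Access tooling respondents described. Field names are
    # hypothetical illustrations, not from any NASA process.
    import csv
    import os
    from dataclasses import dataclass, asdict

    @dataclass
    class InspectionRecord:
        artifact: str          # e.g. "requirements", "high-level design"
        pages_reviewed: int
        prep_hours: float      # total reviewer preparation effort
        meeting_hours: float
        defects_major: int
        defects_minor: int

    def log_inspection(path: str, record: InspectionRecord) -> None:
        """Append one inspection's metrics; write a header if the file is new."""
        row = asdict(record)
        is_new = not os.path.exists(path) or os.path.getsize(path) == 0
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(row))
            if is_new:
                writer.writeheader()
            writer.writerow(row)

    log_inspection("inspections.csv",
                   InspectionRecord("requirements", 40, 6.0, 2.0, 3, 11))

The modesty of the tooling is the point: a flat file of per-inspection effort and defect counts is enough to spot trends without dedicated metrics staff.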

8. Characterization Results (3)
• Do people want to inspect?
  • Not at first.
  • Many were convinced by being involved in effective inspections.
• No project that used inspections was reported as failing.
• Some used inspections to repair projects.
• Several subjects felt that all successful development required inspections.

9. Lessons Learned Results (1)
• Communication Benefit - the most important
  • Team cohesion: understanding who to go to with a question.
  • Communication to the customer.
  • Metrics.
  • "White box reviews" / adaptability.
  • Getting the right technical info to the right people.
  • Projects can survive without inspections if communication happens anyway.
  • Effective practice: combine inspections with other meetings.
• Training Benefit
  • Many projects required significant time for learning and had staffing issues.
  • Team building: getting new members on board.
  • Cross-training; novice training.
• Defect Reduction Benefit
  • It happens - mainly through more communication and better-trained staff.

10. Lessons Learned Results (2)
• Using Perspectives during Review
  • Perspective: the point of view and existing expertise against which a reviewer inspects a software product.
  • [Mars Climate Orbiter, Mishap Investigation Board Phase I Report: the project had inspections, but critical staff were missing.]
• Everybody who had a choice looked to optimize the set of perspectives - even teams with otherwise low formality.
• Some would reschedule if a critical perspective was missing (a simple coverage check of this kind is sketched below).
• Typically there are no checklists, but different perspectives are implicitly assumed to look for different things.
• When checklists are used: "Don't use them as a replacement for thinking."
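The rescheduling practice above amounts to a coverage check over perspectives. Below is a minimal sketch of that check; the perspective names and reviewer assignments are invented for illustration, not taken from any NASA review record (Python):

    # Check that every critical perspective is covered by some attending
    # reviewer; flag a reschedule otherwise. All names are hypothetical.
    CRITICAL_PERSPECTIVES = {"systems engineering", "navigation", "operations"}

    attendees = {
        "alice": {"systems engineering", "flight software"},
        "bob": {"operations"},
    }

    def missing_perspectives(attendees: dict[str, set[str]]) -> set[str]:
        """Return the critical perspectives no attending reviewer covers."""
        covered = set().union(*attendees.values()) if attendees else set()
        return CRITICAL_PERSPECTIVES - covered

    gaps = missing_perspectives(attendees)
    if gaps:
        print("Reschedule: no reviewer covers", sorted(gaps))
    else:
        print("All critical perspectives covered; hold the inspection.")

Here the check reports that "navigation" is uncovered, which in the practice described above would trigger rescheduling rather than holding the inspection anyway.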

11. Lessons Learned Results (3)
• Staffing / Mgmt.
  • It is rare that initial plans include process/metrics support.
  • "Inadequate staffing" was the second-largest development problem.
  • Understaffed projects, even if they want inspection help, have more pressing concerns.
  • Functional support is lacking, especially for metrics collection.
• Losing follow-up is the surest way to demotivate people.
  • The chief benefit of a more formal process is action item tracking with suitable rigor (a minimal sketch follows below).
• Support is necessary... and worth paying for.
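As an illustration of "action item tracking with suitable rigor," the sketch below gives each item an owner, a due date, and an explicit state, so an unverified fix cannot silently drop off the radar. The states and field names are hypothetical, not taken from JPL's or any other formal process (Python):

    # Minimal action-item tracker: items stay visible until a fix is
    # verified, which is the follow-up rigor respondents valued.
    # States and field names are hypothetical illustrations.
    from dataclasses import dataclass
    from datetime import date
    from enum import Enum

    class Status(Enum):
        OPEN = "open"
        FIXED = "fixed"          # author claims the item is resolved
        VERIFIED = "verified"    # moderator confirmed the fix

    @dataclass
    class ActionItem:
        description: str
        owner: str
        due: date
        status: Status = Status.OPEN

    def needs_follow_up(items: list[ActionItem], today: date) -> list[ActionItem]:
        """Items past due that have not yet been verified closed."""
        return [i for i in items
                if i.status is not Status.VERIFIED and i.due < today]

    items = [ActionItem("Clarify timing constraint in req. 4.2", "alice",
                        date(2002, 5, 1))]
    for item in needs_follow_up(items, date(2002, 5, 15)):
        print(f"FOLLOW UP: {item.description} (owner: {item.owner})")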

12. Implications and Next Steps
Candidates for further study:
• Perspective-Based Reading
  • Focus on consistency in requirements/design; individual review.
  • Communication/Training: novice and cross-training; explicit responsibilities and expertise.
  • Perspectives: make important perspectives explicit; go beyond checklists.
• JPL Formal Inspections
  • Positive feedback; often formed the basis for processes.
  • Communication: formal assignment of roles.
  • Staffing/Mgmt: very strong on follow-up tracking.
• DaimlerChrysler inspection "sampling" techniques (illustrated below)
  • Staffing/Mgmt: seems well-suited to software acquisition and to projects on a tight schedule.
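For a rough feel of the sampling idea, the sketch below inspects a random sample of modules first and uses the sampled defect densities to decide which remaining modules merit a full inspection. This is a hypothetical illustration of the general concept under tight-schedule constraints, not DaimlerChrysler's actual procedure (Python):

    # Hypothetical illustration of inspection "sampling": inspect a
    # subset of modules, then prioritize full inspections where the
    # sample found the most defects. Densities here are stand-ins for
    # results that would come from the sampled inspections themselves.
    import random

    modules = [f"module_{i:02d}" for i in range(40)]
    sample = random.sample(modules, k=8)          # inspect ~20% first

    # stand-in for defect densities (defects/KSLOC) found in the sample
    sampled_density = {m: random.uniform(0.5, 6.0) for m in sample}

    # schedule full inspections for the worst-sampling modules first
    for m, d in sorted(sampled_density.items(), key=lambda kv: -kv[1]):
        print(f"{m}: {d:.1f} defects/KSLOC")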

13. Request for Participation
• We are looking for:
  • Civil servants to participate directly
  • Civil servants who are overseeing contracts to allow/encourage contractors to participate
• Controlled experiments: participants spend 1 day to:
  • Receive training in state-of-the-art techniques that can be taken away and used on real projects
  • Receive feedback on the types of defects detected and the effectiveness of the training, with some comparison to their usual approach
• Pilot studies: participants work with us on actual projects and:
  • Receive training in state-of-the-art techniques tailored for their environment & project
  • Receive extended support for inspections, including data collection, consultation, and analysis & feedback

14. Contact Info
Forrest Shull
fshull@fc-md.umd.edu
301-403-8970
