
Cleanroom Software Engineering



Presentation Transcript


  1. Andy Moyer Cleanroom Software Engineering

  2. Cleanroom Software Engineering • What is it? • Goals • Properties of Cleanroom • Cleanroom Technologies • Case Studies • Critiques

  3. Cleanroom Software Engineering: What is it? • A set of principles and practices for the specification, development, and certification of software-intensive systems. • Based on the work of Harlan D. Mills, who developed the approach at IBM in the early 1980s. • Influenced by Dijkstra's structured programming, Niklaus Wirth's stepwise refinement, and David Parnas's modular design • Methods based on two principles: • Programs are rules for mathematical functions • Software testing is sampling

  4. Where the Idea Came From • The cleanroom model follows the ideas of manufacturing semiconductors in a “cleanroom” environment.

  5. Goals of the Cleanroom Process • Main goal: Achieve or approach zero defects • Other goals: • Prevent defects • Create correct, verifiable software • Incremental development

  6. Properties of Cleanroom • Spend a lot of time and money "up-front" preventing defects • Use statistical methods to ensure quality • Formally state and “prove” requirements • Specification, development, and certification may be done by separate teams depending on the size of the project

  7. Cleanroom Technologies • Incremental development under Statistical Process Control • Function-Based Specification, Design, and Verification • Correctness Verification • Statistical Testing and Software Verification

  8. Incremental Development Under Statistical Process Control • Used to reduce the risk of managing large systems by focusing on smaller, manageable subsystems, or increments. • Each increment is developed and tested separately from the whole system before being integrated into it. • As increments are added, the whole system is tested. • Each increment delivers some amount of end-user function, used to obtain customer confirmation or clarification

  9. Incremental Development Under Statistical Process Control • Each increment should have these properties: • Externally executable • Contains all functionality of prior increments • Benefits: • Specification, design, and certification activities may be performed often • Timely feedback can be obtained from the customer • Intellectual control over the system is maintained

  10. Cleanroom Technologies • Incremental development under Statistical Process Control • Function-Based Specification, Design, and Verification • Correctness verification • Statistical Testing and Software Verification

  11. Function-Based Specification, Design, and Verification • Objectives: • Expand the specification into implementation via small steps • Maintain intellectual control by designing data and control structures at appropriate levels of abstraction • The Three Boxes: • Black Box • State Box • Clear Box

  12. Function-Based Specification, Design, and Verification • The Box Structures • Begins with an external view – Black Box • Transformed into a state machine – State Box • Fully developed into a procedure – Clear Box

  13. The Black Box • A view of an object that hides data implementation and process implementation. It describes how a system responds to stimuli, usually in a formal specification language. • No internal state is looked at • Our good friend – Z (Zed)
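The black-box view can be sketched as a pure function from the complete stimulus history to a response, with no visible state. The sketch below is ours, not from the slides, and uses a hypothetical up/down counter as the specified system:

```python
# Black-box view (illustrative sketch): the system is specified only by the
# mapping from its full stimulus history to its next response. No data or
# process implementation is visible.

def black_box(stimulus_history):
    """Specification: respond with (# of "up" stimuli) - (# of "down" stimuli)."""
    return stimulus_history.count("up") - stimulus_history.count("down")

print(black_box(["up", "up", "down"]))  # 1
```

Any implementation that produces the same stimulus-to-response mapping satisfies this black box, regardless of how it stores state internally.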

  14. The State Box • A view of an object that shows data implementation but hides process implementation. It describes how "state" information is transformed. • Derived from the black box – first step to implementation • This is represented as a finite state machine
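As a sketch (ours, using a hypothetical up/down counter), the state box folds the stimulus history into explicit state and a transition function, in the manner of a finite state machine:

```python
# State-box view (illustrative sketch): the stimulus history is encapsulated
# as explicit state (here a single integer), updated by a transition function.
# The process implementation behind the transition is still hidden.

def state_box(state, stimulus):
    """Transition function: (old state, stimulus) -> (new state, response)."""
    if stimulus == "up":
        state += 1
    elif stimulus == "down":
        state -= 1
    return state, state  # the response is the updated count

state = 0
for stimulus in ["up", "up", "down"]:
    state, response = state_box(state, stimulus)
print(response)  # 1
```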

  15. The Clear Box • A view of an object that shows both data implementation and process implementation. The goal is to stepwise refine procedures and to prove them correct. • Derived from the state box • May introduce new black boxes defining major operations • Creates the required responses (outputs) to stimuli (inputs)
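Continuing the same hypothetical counter example (our sketch, not from the slides), the clear box exposes both the concrete data representation and the procedure that operates on it:

```python
# Clear-box view (illustrative sketch): both the data implementation and the
# process implementation are visible, so the procedure can be stepwise
# refined and argued correct against the specification.

class Counter:
    def __init__(self):
        self.count = 0  # concrete data implementation of the state

    def stimulate(self, stimulus):
        # Procedure implementing the transition from stimulus to response.
        if stimulus == "up":
            self.count += 1
        elif stimulus == "down":
            self.count -= 1
        return self.count

c = Counter()
responses = [c.stimulate(s) for s in ["up", "up", "down"]]
print(responses)  # [1, 2, 1]
```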

  16. Stepwise Refinement of Specifications to Code • Steps: • Expand clear box into lower level designs • Verify the correctness of the expanded designs • Expand low level designs into code

  17. Cleanroom Technologies • Incremental development under Statistical Process Control • Function-Based Specification, Design, and Verification • Correctness verification • Statistical Testing and Software Verification

  18. Correctness Verification • The primary debugging process for Cleanroom • Each box structure is subject to correctness verification • Objectives: • Determine that the box structures correctly implement the design • Remove any errors that were introduced during development • Completely review the code for completeness, consistency, and correctness

  19. Correctness Verification • So what is verified? • That the box structure behaves as defined • The correctness of each refinement • That the stimulus-response mapping defined in one step is preserved in the next
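The correctness condition can be stated concretely. In Cleanroom it is established by team review and reasoning rather than by execution, but the sketch below (our illustration, reusing a hypothetical up/down counter) expresses the same condition: the refined clear box must preserve the black-box mapping for every stimulus sequence, here checked exhaustively up to a bounded length:

```python
import itertools

# Black-box specification: response = (# "up") - (# "down") over the history.
def black_box(history):
    return history.count("up") - history.count("down")

# Refined clear-box implementation to be verified against the specification.
class Counter:
    def __init__(self):
        self.count = 0
    def stimulate(self, stimulus):
        self.count += {"up": 1, "down": -1}.get(stimulus, 0)
        return self.count

# Check: the refinement preserves the specified mapping on every stimulus
# sequence of length 1..5 (62 sequences in total).
for n in range(1, 6):
    for seq in itertools.product(["up", "down"], repeat=n):
        c = Counter()
        for stimulus in seq:
            response = c.stimulate(stimulus)
        assert response == black_box(list(seq)), seq
print("refinement preserves the specified mapping")
```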

  20. Cleanroom Technologies • Incremental development under Statistical Process Control • Function-Based Specification, Design, and Verification • Correctness verification • Statistical Testing and Software Verification

  21. Statistical Testing and Software Verification • Each increment of compilable functionality is statistically tested to: • Ensure that it meets the quality standards defined by the development organization • Certify a level of reliability that the product will deliver in the field • Provide feedback for quality control and process improvement

  22. Statistical Testing and Software Verification • Steps taken: • Plan the test • A stratification plan, which describes each layer to be developed • A sampling plan, which describes which physical testing resources are devoted to each increment and each stratum • Develop the usage models required by the test plan • Develop and validate the usage distribution

  23. Example Usage Model
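A usage model is commonly expressed as a Markov chain: states are points in a user session and transitions carry the probability of each next stimulus. The model below is hypothetical (the states and probabilities are our invention, not the slide's example); statistical test cases are drawn by random walks from entry to exit:

```python
import random

# Hypothetical usage model as a Markov chain. Each state maps to a list of
# (next state, transition probability) pairs; probabilities out of a state
# sum to 1.0. Test cases are sampled according to expected field usage.
usage_model = {
    "start":  [("login", 1.0)],
    "login":  [("query", 0.7), ("update", 0.2), ("exit", 0.1)],
    "query":  [("query", 0.4), ("update", 0.3), ("exit", 0.3)],
    "update": [("query", 0.5), ("exit", 0.5)],
}

def generate_test_case(model, rng):
    """Random walk from "start" to "exit", yielding one statistical test case."""
    state, path = "start", []
    while state != "exit":
        next_states, weights = zip(*model[state])
        state = rng.choices(next_states, weights=weights)[0]
        path.append(state)
    return path

rng = random.Random(0)  # seeded for a reproducible sample
for case in (generate_test_case(usage_model, rng) for _ in range(3)):
    print(case)
```

Because test cases are drawn in proportion to expected usage, pass/fail results over many such walks support the statistical reliability certification described on the previous slides.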

  24. NASA Case Study • In March 1990, NASA performed a case study comparing the SEL Cleanroom process with its typical process • Each group applied its process to the same project – a Coarse/Fine Attitude Determination Subsystem of the Upper Atmosphere Research Satellite. • The completed system contained about 34,000 Source Lines of Code (SLOC), primarily FORTRAN

  25. NASA Case Study-SEL Cleanroom Lifecycle

  26. NASA Case Study-Experience Comparison

  27. NASA Case Study • Testers and developers are on completely separate teams • Developers have no access to the mainframe computer for compilation and testing • Developers rely on code reading instead of unit testing to verify correctness • Testers use a statistical testing approach

  28. NASA Case Study - Results

  29. NASA Case Study - Results • Failure Rate: • Standard SEL process – 6 errors/KSLOC • Cleanroom SEL process – 3.3 errors/KSLOC • The inexperience of the Cleanroom team had no effect on the outcome • The impact of unstable requirements was lessened by a concentrated effort in team reviews

  30. DoD Case Study • The STARS program is a US Department of Defense (DoD) research and development program • Emphasizes: • Process driven • Re-use based • Integrated software engineering environment

  31. DoD Case Study • Typical Process • Productivity measured at 121 LOC/person-month • Failure rate: 23–27 failures/KLOC • Cleanroom Process • Productivity measured at 559 LOC/person-month • Failure rate: 1 failure/KLOC

  32. DoD Case Study • Important benefits: • Improved Productivity • Improved quality • High staff morale

  33. Critiques • In 1997, Beizer argued that the elimination of unit testing contradicts “known testing theory and common sense” • Also, is it possible to find bugs without compiling/running code?

  34. References • 1. Cleanroom Software Engineering. Retrieved from University of Texas at Arlington website: http://www.uta.edu/cse/levine/fall99/cse5324/cr/clean/page1.html • 2. DACS, The Data and Analysis Center for Software. Retrieved January 5, 2009, from www.dacs.dtic.mil/databases/url/key.php?keycode=64

  35. References • 3. Green, Scott et al. (1990). "The Cleanroom Case Study in the Software Engineering Laboratory: Project Description and Early Analysis". NASA. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19910008271_1991008271.pdf • 4. Prowell, Stacy et al. (1993). "Cleanroom Software Engineering: Technology and Process". Reading, MA: Addison Wesley Longman, Inc.
