
CIS 573 Software Engineering


Presentation Transcript


  1. CIS 573 Software Engineering Carl A. Gunter Fall 1999

  2. Contact Information • Course Web Page: http://www.cis.upenn.edu/~cis573. • Course announcements. • Lecturer: Carl Gunter. • Office hour: Thursday, 12:30-1:30, 370 Moore, 898-9506. • Graduate Assistant: Mike McDougall. • Office hour: Wednesday, 1:00-2:00, Moore 057a, 898-8116.

  3. What will I learn in the course? • Software engineering generally. • Safety engineering as it offers lessons and ideas for software. • General principles for building safety critical software systems. • Techniques to achieve high confidence. • How to analyze accidents.

  4. Pre-requisites • Interest in both software and the systems in which it is used. • Programming in Java. • Basic skill in mathematics; ability to learn some logic.

  5. What am I Expected to Do? • Participate in classes. • Read designated materials. • Projects: individually or on a team. • Final Exam: assesses understanding of lectures, reading, and project presentations.

  6. Do Silly Icons Help? Yes!

  7. Participation and Reading • Slides distributed on the course web page. • Textbook: Leveson, Safeware: System Safety and Computers. • Other materials will be distributed.

  8. Projects • Achieving confidence. • Verifying software. • Specifying software. • Coding from a specification. • Testing software.

  9. Project Rules • No partner on first project. • Groups of two are allowed on all subsequent projects, but your partner must be different on each project. • Partners provide equal effort on a project.

  10. Verification • Computer hardware and software can be mathematically described. • Hence, computers can be used to automate the verification of computer hardware and software systems.

  11. Verification and Testing • Testing is like verification since each successfully-passed test is like a little theorem that has been proved about the implementation. • Verification has the capacity to cover large sets of cases exhaustively, eliminating the need for coverage conditions or statistical measures of confidence.
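To make the contrast concrete, here is a minimal sketch in Java (the course language). The averaging function and its specification are invented for illustration: exhaustively checking every input pair plays the role of verification, leaving no need for coverage conditions or statistical confidence.

```java
public class ExhaustiveCheck {
    // Function under scrutiny: a branchless floor-average (illustrative).
    static int average(int a, int b) {
        return (a & b) + ((a ^ b) >> 1);
    }

    public static void main(String[] args) {
        // Exhaustive check over all pairs of 8-bit inputs: every case
        // is covered, so the result is a proof over this input space.
        for (int a = -128; a <= 127; a++) {
            for (int b = -128; b <= 127; b++) {
                int expected = (int) Math.floor((a + b) / 2.0);
                if (average(a, b) != expected) {
                    System.out.println("counterexample: a=" + a + ", b=" + b);
                    return;
                }
            }
        }
        System.out.println("property holds for all 65,536 input pairs");
    }
}
```

A random test suite only samples the same space; the exhaustive loop replaces sampling with full coverage, which is what distinguishes verification from testing at this small scale.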

  12. Verification by Reading [Diagram from the IEEE Standard for Software Reviews and Audits, relating product- and project-level verification: reading-based methods (technical review, software inspection, management review, walkthrough, audit) alongside testing, simulation, and formal verification.]

  13. Verification and Validation • Verification can be used to show that the software or hardware conforms to a rigorous description of its expected behavior. • It cannot show that the behavior described is the one the user wanted. • Verification: building the system right • Validation: building the right system
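For example, a sorting routine can be verified to meet a specification that orders records by ascending key, yet fail validation if the users actually wanted descending order: the system was built right, but it is not the right system.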

  14. First Assignments • Reading: Chapters 1,2,3,4 of Leveson • Project: Dekker, correctness of mutual exclusion algorithms
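For orientation on the project topic, here is a sketch of Dekker's algorithm for two threads, in a standard textbook formulation (not the project's required code; the volatile fields keep the JVM from reordering or caching the shared variables).

```java
// Dekker's mutual exclusion algorithm for two threads (ids 0 and 1).
class Dekker {
    private volatile boolean want0 = false, want1 = false;
    private volatile int turn = 0;

    void lock(int id) {
        setWant(id, true);
        while (id == 0 ? want1 : want0) { // contention: the other thread wants in
            if (turn != id) {             // not our turn: withdraw and wait
                setWant(id, false);
                while (turn != id) { /* busy-wait */ }
                setWant(id, true);
            }
        }
    }

    void unlock(int id) {
        turn = 1 - id;                    // hand priority to the other thread
        setWant(id, false);
    }

    private void setWant(int id, boolean b) {
        if (id == 0) want0 = b; else want1 = b;
    }
}
```

Proving that such code guarantees mutual exclusion, and that neither thread can starve the other, is exactly the kind of claim the verification techniques in this course address.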

  15. Recommended for Fun and Profit • Donald A. Norman, The Psychology of Everyday Things, Basic Books, 1988. • Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995. Drawn from the bulletin board news:comp.risks. • Charles Perrow, Normal Accidents, Basic Books, 1984. • Clifford Stoll, The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage, 1995 (mass market paperback). • Lauren Ruth Wiener, Digital Woes: Why We Should Not Depend on Software, Addison-Wesley, 1993.

  16. What is Risk? Risk = Probability of Failure × Loss from Failure. Mitigate risk by increasing reliability or decreasing severity.
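With illustrative numbers: a failure with probability 0.001 per year that causes $5,000,000 in losses carries an expected annual risk of 0.001 × $5,000,000 = $5,000. Halving either factor (the probability, via reliability, or the loss, via severity) halves the risk.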

  17. Risk and Opportunity • Many opportunities are held back by the expense of high risk. • Better assurance techniques break these barriers. • When is the risk low enough?

  18. Risk and Opportunity Now • Fuel injection and anti-lock braking • Fly-by-wire aircraft and computer-controlled landings • Reduced time gaps between trains • Credit card purchases on the web and banking online • Online shareholder voting

  19. Risk and Opportunity in the Future • Intelligent vehicles and highways • Electronic wallets • Genetically engineered organisms

  20. Course Strategy • VV&T (verification, validation, and testing) is a technology for increasing confidence in a system. • Its most vigorous application is in areas where the cost of failure is high. • We will focus primarily on software of this kind.

  21. “High Risk” Computer Systems • Safety critical • Transportation and power systems • Medical and emergency systems • Security critical • Military systems • Electronic commerce • Mission critical • Key information systems • Key control systems

  22. Low Risk Systems • When is a system non-critical? It is subjective and depends on use. • Some software is not strongly backed by its maker. Here is a standard industry disclaimer: The entire risk as to the quality and performance of the program is with you. Should the program prove defective, you … assume the entire cost of all necessary servicing, repair or correction

  23. Low Risk Systems continued • A refreshingly straightforward disclaimer: We don't claim EasyFlow is good for anything---if you think it is, great, but it's up to you to decide. If EasyFlow doesn't work: tough. If you lose a million because EasyFlow messes up, it's you that's out the million, not us. If you don't like this disclaimer: tough. We reserve the right to do the absolute minimum provided by law, up to and including nothing.

  24. Rigid Distinctions? • There are significant differences between the classes of high-risk systems. • Analysis of energy for safety systems. • Concept of an adversary in security systems. • But there are also many common themes. • Reliability of components. • Replication. • Backup. • Controlled failure modes.
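As one concrete instance of the replication theme, here is a minimal sketch (invented for illustration) of triple modular redundancy: three independent replicas compute the same value, and a majority vote masks a single faulty result.

```java
import java.util.function.IntSupplier;

// Triple modular redundancy (TMR): a majority vote over three
// replicas masks one faulty result. Illustrative sketch only.
public class TmrVoter {
    static int vote(IntSupplier a, IntSupplier b, IntSupplier c) {
        int x = a.getAsInt(), y = b.getAsInt(), z = c.getAsInt();
        if (x == y || x == z) return x;  // x belongs to the majority
        if (y == z) return y;            // y and z form the majority
        throw new IllegalStateException("no majority: all replicas disagree");
    }

    public static void main(String[] args) {
        // One replica returns a faulty value; the voter still masks it.
        System.out.println(vote(() -> 42, () -> 42, () -> 7)); // prints 42
    }
}
```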

  25. Case Study: Clayton Tunnel [Diagram: signalmen A and B, each with a semaphore signal, stand at opposite ends of the tunnel and communicate over a needle telegraph with three messages: In! (train in tunnel), Clear! (tunnel is free), and Ok? (has the train left the tunnel?).] From Gerard Holzmann, Design and Validation of Computer Protocols.

  26. [Animation frame: message exchange In!, Ok?, Clear!]

  27. [Animation frame: message exchange Buzz!, In!]

  28. [Animation frame: message exchange In!, Ok?, Clear!]
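A toy model (invented here, not from Holzmann's book) of why the protocol is unsafe: the telegraph messages carry no train identity, so when two trains are in the tunnel a single Clear! can be read as "the tunnel is empty".

```java
// Toy model of the Clayton Tunnel protocol flaw (illustrative only).
class SignalmanA {
    private int trainsBelievedInTunnel = 0;

    void sentIn()        { trainsBelievedInTunnel++; }   // telegraphed "In!"
    void receivedClear() { trainsBelievedInTunnel = 0; } // reads "Clear!" as "tunnel is free"

    boolean believesTunnelFree() { return trainsBelievedInTunnel == 0; }

    public static void main(String[] args) {
        SignalmanA a = new SignalmanA();
        a.sentIn();        // first train enters before the signal resets
        a.sentIn();        // second train follows it into the tunnel
        a.receivedClear(); // B reports the *first* train out: "Clear!"
        // A now believes the tunnel is free while one train is still inside.
        System.out.println("A believes tunnel is free: " + a.believesTunnelFree());
    }
}
```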

  29. Classes of Risks • Business Risk • Inadequate consumer interest in product • Standard for product controlled by competitor(s) • Project Risk • Inadequate time • Inappropriate personnel • Operational Failure Risk • Unavailability • Erroneous operation

  30. Risk Factors • Appearance of new hazards • Increasing complexity • Increasing exposure • Increasing amounts of energy • Increasing automation of manual operations • Increasing centralization and scale • Increasing pace of technological change

  31. Acceptable Risk • When is risk low enough? • Risk-benefit analysis does not resolve moral issues. • Often the people taking the risk are not the ones benefiting from the opportunity. • Can we walk away from technical opportunity?

  32. Computers as Chameleon Machines

  33. Role of Computers • Providing information or advice to a human operator upon request. • Interpreting data and displaying it to the controller, who makes the control decisions. • Issuing commands directly, but with a human monitor of the computer’s actions providing varying levels of input. • Eliminating the human from the control loop.

  34. Terminology • Operator • Process (under computer control) • Control • Display • Sensor • Actuator
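A minimal sketch (interfaces and the gain are invented here) of how these terms relate in one step of a computer control loop: the computer reads a sensor attached to the process, drives an actuator, and keeps the operator informed through a display.

```java
interface Sensor   { double read(); }         // measures the process
interface Actuator { void apply(double u); }  // acts on the process
interface Display  { void show(double v); }   // informs the operator

class ControlLoop {
    // One control step: sense, compare with the goal, act, display.
    void step(Sensor sensor, Actuator actuator, Display display, double setpoint) {
        double measured = sensor.read();
        double error = setpoint - measured;
        actuator.apply(0.5 * error);   // simple proportional control
        display.show(measured);        // keep the human operator in the loop
    }
}
```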

  35. Four Roles for the Computer

  36. Less Obvious Implications with Indirect Control

  37. What Makes it Hard to Build Software? • Complexity of the functions required. • Conformity to existing artifacts and standards. • Changeability of the functions required. • Invisibility of the software artifact, making it hard to model or visualize. From Fred Brooks, No Silver Bullet: Essence and Accidents of Software Engineering.

  38. What Makes Software Different? • Software is primarily a design, with no manufacturing variation, wear, corrosion, or aging aspects. • It has a much greater capacity to contain complexity. • It is perceived to be easy to change. • Software errors are systematic, not random. • It is intangible. From the Motor Industry Software Reliability Association (MISRA), Development Guidelines for Vehicle Based Software.

  39. The Curse of Flexibility

  40. The Concept of Causality The cause of an event is a set of conditions, each of which is necessary and which together are sufficient for the event to occur. Individual conditions are called causal conditions or factors.
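For example, a dropped match causes a forest fire only together with other conditions such as oxygen, flammable material, and dry weather; each condition is a causal factor, and only the set as a whole is sufficient for the fire.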

  41. Chemical Process Industry • Classification of hazards: • fire • explosion • toxic release. • Factors influencing risk: • Size of inventory • Energy • Time • Intensity/distance relationship • Exposure

  42. Case Study: Bhopal • In December 1984, a release of methyl isocyanate (MIC) from the Union Carbide chemical plant in Bhopal, India resulted in the worst industrial accident in history. • MIC is used in pesticides. • Demand for MIC pesticides had dropped after 1981, so the plant was experiencing budgetary cutbacks.

  43. Storage of MIC [Diagram: three storage tanks, 610, 619, and 611, holding roughly 50 tons, 21 tons, and 1 ton of MIC; two of the tanks were thought to contain only 15 and 20 tons. Design limits per tank: capacity 60 tons, fill limit half full, temperature 0°C, pressure 3 psi.]

  44. Backups • Vent gas scrubber. • Flare tower. • Water curtain. • Siren.

  45. Events • 10:30pm, December 2, 1984. A new worker was cleaning some valves. • 11:00pm. Pressure was 10psi; temperature was 20°C. • 11:30pm. A leak was discovered; workers noticed eye irritation. • 12:40am. 40psi, 25°C; rumbling noise in the tank; the concrete casing cracked. The temperature then reached 400°C, and the tank began releasing 50,000 pounds of MIC gas. • 12:50am-12:55am. The siren sounded when MIC was seen escaping from the vent stack.

  46. Cause and Effect • 2,000 to 3,000 people killed, 10,000 with permanent disabilities, 200,000 injured. • Blamed by management on 'human error'. • This masks the complexity of the causal factors.

  47. Over-simplification • Human error. • Technical failures. • Organizational factors. • Multiplicity of factors. • Legal financial responsibility.

  48. Legal View • Cause in fact is established by evidence showing that a defendant’s act or omission was a necessary antecedent to plaintiff’s injury. • Legal (or proximate) cause is a device for limiting liability of a defendant to consequences bearing some reasonable relationship to the risks he or she created.
