
MEM 604: Social, Legal and Ethical Considerations for Engineering








  1. MEM 604: Social, Legal and Ethical Considerations for Engineering Managing Safety and Liability

  2. Anticipatable Risk? • How would an engineer assess the differences between the 1945 crash of a B-25 into the Empire State Building and the attack by 767s on the World Trade Center? • What is the significance of the efforts to weaken building codes and apply untested design principles to maximize commercial prospects? • Could they or should they have anticipated the possibility of the terrorist attacks?

  3. Liability and Safety • Risk is an ineliminable feature of engineering. • That is true for the use of well established principles and substances. • It is particularly true for innovative techniques and products. • In response, engineers are required to: • anticipate and estimate risk; • be conscious of their willingness to tolerate risk; • define acceptable risk; • be aware of the principles of legal liability.

  4. Starting with the Codes • For engineers, the natural place to start a discussion of safety and risk is with the various codes of ethics that govern professional behavior. • These codes are largely consistent in placing a high premium on safety. Inasmuch as safety and risk are inversely related, the more substantial the safety measures in place, the smaller the risk, and vice versa.

  5. Codes and Safety • Though the codes consistently require engineers to emphasize safety in their designs and oversight responsibilities, as is typical with codes, the generality of their guidance may not be sufficiently action guiding. • Cf. NSPE code, p. 376, c. 2. • In addition to general concerns for safety, another common element of codes is an emphasis on the significance of informed consent. • Commonly accepted practice also includes the notion of “factors of safety”: 6x multipliers are the norm.
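The factor-of-safety idea can be sketched in a few lines of code. This is a minimal illustration, not from the text; the function name, load figure, and units are assumed.

```python
# Minimal sketch of a factor of safety: the design must provide capacity
# equal to the expected load scaled by a multiplier (names/values assumed).

def required_capacity(expected_load: float, safety_factor: float = 6.0) -> float:
    """Return the minimum capacity a design must provide, given the
    expected load and a factor of safety (6x per the norm noted above)."""
    return expected_load * safety_factor

# A member expected to carry 10 kN must be designed for 60 kN:
print(required_capacity(10.0))  # 60.0
```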

  6. Local Building Codes • In addition to professional codes, engineers are obligated to design in conformity to local building codes. • Failure to do so is a significant moral failure. • Additionally, engineers who identify limitations in existing codes have an obligation to pursue changes to the code. • Compare the examples of the Twin Towers to that of the Citicorp Building (p.154). • Another example: Dr. Lynn Beason (155-6).

  7. The Challenges of Risk Assessment • As clearly as we can specify an engineer’s obligation to public welfare, there is nonetheless often considerable ambiguity in correctly assessing safety implications. • This is largely due to the difficulty of assessing risk. • Though we can never remove the uncertainty of the activity, there are a number of techniques that can be employed to reduce it.

  8. Fault Trees • One method that engineers can use to help them identify and assess risks is the Fault Tree. • A fault tree is a diagram specifying the failure modes of a system. • Failure Mode: a way in which a system, structure or process can malfunction (156). • They aid in the systematic analysis of the failure modes of a project, to the extent that they can be anticipated.
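The logic of a fault tree can be illustrated with simple AND/OR gate arithmetic. This is a minimal sketch assuming independent basic events; the gate functions, the example top event, and the probabilities are hypothetical, not from the text.

```python
# Fault-tree gate arithmetic, assuming independent basic events.

def and_gate(*probs: float) -> float:
    """The output event occurs only if ALL inputs fail: P = p1 * p2 * ..."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(*probs: float) -> float:
    """The output event occurs if ANY input fails: P = 1 - (1-p1)(1-p2)..."""
    none_fail = 1.0
    for p in probs:
        none_fail *= (1.0 - p)
    return 1.0 - none_fail

# Hypothetical top event: the system fails if power is lost OR if both
# the primary and the backup pump fail.
p_top = or_gate(0.01, and_gate(0.05, 0.05))
```

Note that the multiplication rules only hold when the basic events are independent; common-cause failures are one reason such an analysis can understate risk, anticipating the limitations discussed below.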

  9. Example of a Fault Tree • www.relexsoftware.com/resources/art/art_fta.asp

  10. Event Trees • Another device that can be used to help identify and assess risks is called an Event Tree. • An event tree is a diagram which traces out the possible implications of a hypothetical system, structure or process failure. • The advantage of such diagrams is their ability to include quantitative analysis of risk.
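The quantitative side of an event tree can be sketched by multiplying branch probabilities along each path from an initiating event. The safety systems, probabilities, and frequencies below are hypothetical, chosen only to show the mechanics.

```python
# Event-tree sketch: after an initiating failure, each safety system
# either works or fails, splitting the tree into branches whose
# probabilities multiply along each path (all figures hypothetical).

initiating_frequency = 1e-3   # initiating failures per year (assumed)
p_alarm_fails = 0.05
p_sprinkler_fails = 0.10

branches = {
    "alarm works, sprinkler works": (1 - p_alarm_fails) * (1 - p_sprinkler_fails),
    "alarm works, sprinkler fails": (1 - p_alarm_fails) * p_sprinkler_fails,
    "alarm fails": p_alarm_fails,  # sprinkler is never triggered
}

# Frequency per year of the fully unmitigated outcome:
worst_case = initiating_frequency * branches["alarm fails"]
```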

  11. Example of an Event Tree • http://www.adventengineering.com/images/Other_images/NucEventtree.gif

  12. Problems with these Techniques • Both of these methods have important limitations. • It is difficult to anticipate all of the failure modes. • It is difficult to anticipate all human errors. • It is difficult to assign probabilities to all of the possible events.

  13. Normal Accidents? • The difficulties attendant on the kind of fault analysis we’ve reviewed have led some to argue that we need to stop thinking of accidents as anomalies and begin to understand them as part of the normal course of affairs. • This is particularly true of high risk technologies, due to two factors: • Tight Coupling: processes closely linked. • Complex Interactions: interactions between elements difficult to predict. • Cf. the example of New York Telephone, p. 160-1.

  14. Normalized Deviance • The challenge posed by high risk technologies can be exacerbated by the tendency of people familiar with the risks to tolerate them. • This tolerance has been labeled Normalized Deviance. • Something like this seems to be at the root of the Challenger disaster (see p. 162).

  15. Acceptable Risk? • The concepts of Normal Accidents and Normalized Deviance underscore the sense in which we must all be willing to accept some level of risk. • Risk: the product of the likelihood and the magnitude of the harm (163). • The question becomes: What is acceptable risk? • Different groups are going to have different answers.
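The definition of risk as the product of likelihood and magnitude (163) can be made concrete with a toy calculation; the figures below are illustrative only.

```python
# Risk = likelihood x magnitude of harm (per the definition on p. 163).
# Illustrative figures only.

def risk(likelihood: float, magnitude: float) -> float:
    return likelihood * magnitude

# A frequent minor harm can carry the same risk as a rare, severe one --
# one reason different groups give different answers to the question of
# what counts as acceptable.
frequent_minor = risk(0.10, 1_000)        # e.g. harm measured in dollars
rare_severe = risk(0.0001, 1_000_000)
```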

  16. Experts, Utilitarianism and Risk • For the expert, some version of Utilitarian analysis is usually employed to identify acceptable risk. • From this perspective, acceptable risk can be defined as, “…one in which, given the options available, the probability of producing harm is at least equaled by the probability of producing benefit” (165). • The problems with this approach are essentially the same problems that utilitarianism itself is faced with (measurement, justice).

  17. Laypersons and Risk • Humans are notoriously bad at estimating risk. Experts have a real advantage here. • Non-experts do not as carefully isolate their evaluation of risk from their consideration of other potentially relevant factors. • To some extent, the differences may be attributable to ignorance, but to a large extent they are doubtless attributable to different contexts of evaluation.

  18. Respect for Persons: Informed Consent • Laypersons are usually more committed to a Respect for Persons approach. • Key to such an approach is the concept of Informed Consent. • Informed consent occurs when three conditions are met: • No coercion; • Relevant information is provided; • Rational and competent evaluation is possible.

  19. Respect for Persons: Justice • At the heart of the notion of informed consent is a concern for justice. It doesn’t seem right that some individuals should bear the burden for the group. • For the layperson, acceptable risk can be defined as, “…one in which risk is assumed by free and informed consent, or properly compensated, and which is justly distributed” (170).

  20. Engineers and Liability • Standard of Proof in Tort Law: Preponderance of the Evidence. • Guiding Legal Principle in Liability Cases: Proximate cause. • Despite these (low) standards, tort law may still not provide enough protection for consumers or end-users of a product.
