
Risk Assessment and Perception



  1. Risk Assessment and Perception • Risk Assessment and Social Control of Technology • The Meaning of Risk • Risk Assessment • Expert Analysis vs Public Perception • 3 Ways of Looking at Risk • Managing Risk

  2. Introduction • Why is risk assessment important? • Most technologies have some elements of risk • Public perception of risk can influence whether / how technologies are deployed • We should be considering ways of managing risk to • prevent harm to people and the environment, • avoid the worst consequences of technology, • improve quality of life

  3. The Social Control of Technology: Actors and Mechanisms • [Diagram: actors (Government, Public, Media, Professionals/Experts) connected to technologies with risk through mechanisms such as risk acceptance, risk perception, risk assessment, risk communication, deployment, regulation, influence, and recommendations]

  4. Coursepack reading: Wolfe et al. (2002) • It is difficult to predict which technologies are negotiable and which are totally out of the question • Most literature focuses on individual aspects • risk communication, • negotiation and conflict resolution, • models of public participation • The authors set out to define a more comprehensive framework for understanding public acceptance, which combines all of these factors and others • Risk acceptance and perception are not the only issues, but the authors do suggest they are very important

  5. The meaning of risk • Narrow definition: • Many measures/definitions of risk focus on the effect on humans, especially the chance of causing death • Broader definition: • “the chance of adverse outcome to human health, the quality of life, or the quality of the environment” (Graham and Wiener 1995, p.23)

  6. Two main elements to risk • The probability of some event • The degree of consequences of that event

  7. 4 Categories of Risk (Lewis 1990) • Familiar high risks • e.g. driving and hang-gliding • Risks of low probability and large consequence • e.g. dam failure, earthquake • Very low probability, very large consequences • e.g. destructive change in global climate • Risks buried in a background of “natural occurrence” • e.g. cancer due to environmental contaminants

  8. Risk Assessment • Informal risk assessment • we do this all the time without calculation • Formal risk assessment (experts) • Identification of possible hazards • How could the hazard occur? • Assess probability of hazardous events • Assess consequences of the event • Risk = probability * consequences • SIMPLE?
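
The multiplication on slide 8 can be made concrete with a short, purely illustrative sketch. The hazards, probabilities, and consequence scores below are invented for the example and are not taken from the slides.

```python
# A minimal sketch of the "Risk = probability * consequences" calculation
# from slide 8. All hazards and numbers here are hypothetical.

hazards = [
    # (name, annual probability of the event, consequence score)
    ("dam failure", 1e-4, 10_000),
    ("car crash on a commute", 1e-2, 10),
    ("minor equipment fault", 5e-1, 0.1),
]

for name, probability, consequence in hazards:
    risk = probability * consequence  # probability-weighted consequence
    print(f"{name}: risk = {probability} * {consequence} = {risk}")
```

Even this toy version hints at why the slide ends with "SIMPLE?": the multiplication is trivial, but estimating the probability and choosing a defensible consequence score are not.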

  9. 4 Categories of Risk (Lewis 1990) • Familiar high risks • Probabilities easier to determine (lots of data/experience) • Consequences are more difficult (e.g. subjective judgments on quality of life) • Risks of low probability and large consequence • Probabilities also difficult to determine (use of event trees) • Consequences equally difficult • Very low probability, very large consequences • Guesswork in assessing probabilities • Very severe consequences (we can only imagine) • Risks buried in a background of “natural occurrence” • Difficult to separate consequences due to technology from natural consequences

  10. An example event tree (see Rasmussen, 1990)
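
Slide 10 points to an event-tree figure (Rasmussen, 1990) that does not survive in the transcript. As a rough illustration of how such a tree is used, the sketch below multiplies an assumed initiating-event frequency by assumed failure probabilities at each branch to estimate how often the worst outcome path occurs; the event, branch names, and numbers are all hypothetical.

```python
# A toy event-tree calculation in the spirit of slide 10.
# The initiating event and branch probabilities are assumed, not from the slides.

initiating_event_frequency = 1e-3   # e.g. a pipe break, occurrences per year (assumed)

# Each branch point is a safeguard that either works or fails.
branches = [
    ("emergency cooling fails", 1e-2),
    ("containment fails", 1e-1),
]

# Follow the path where every safeguard fails: multiply the initiating
# frequency by the failure probability at each branch.
worst_case_frequency = initiating_event_frequency
for name, failure_probability in branches:
    worst_case_frequency *= failure_probability

print(f"Worst-case path frequency: {worst_case_frequency:.1e} per year")
# 1e-3 * 1e-2 * 1e-1 = 1e-6 per year
```

This is why slide 9 notes that event trees are used for low-probability, large-consequence risks: the path probabilities are too small to observe directly, so they are built up from the probabilities of the individual branch failures.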

  11. Ranking of Risky Technologies • Choose the ten riskiest technologies • Rank them from 1 (most risky) to 10 (least risky)

  12. Why the difference?

  13. For each of the 30 technologies, respondents (experts and laypersons) were asked: 1) The number of people likely to die next year 2) The number likely to die in a particularly disastrous year 3) The degree to which the activity’s risks were: • voluntary, • controllable, • known to science, • known to those exposed, • familiar, • dreaded, • certain to be fatal, • catastrophic, • immediately manifested There was some convergence between experts and laypersons on the third set of questions

  14. The crucial difference found between experts and laypersons • Experts recognized dread and the unknown but did not consider them relevant in judging riskiness • Experts placed a high value on the probability of death • For laypersons, the concept of “dread risk” played a large role in judgments of risk • Laypersons placed less value on the probability of death

  15. Dread Risk is associated with: • Lack of control over an activity • Fatal consequences if there were a mishap • High catastrophic potential • Inequitable distribution of risks and benefits • Belief that risks are increasing and not easily reducible

  16. Three ways of looking at risk assessment (Perrow, 1984) • Absolute Rationality • Calculations can be made (by experts) about risks and benefits • Public is irrational / ill-informed • Limited or “bounded” rationality • The public has limits to their knowledge • We use heuristics to inform our opinions about risk (media, recent events, experience) • The public’s fears must be treated with respect and considered in making policy, but the gap is to be closed by bringing the public over to the experts’ side • Social Rationality • Limited knowledge is a good thing • Brings about social bonding (i.e. we must learn from each other) • Fosters new perspectives and different values

  17. Managing Risk • How do we reduce risk? • Prevention: lower the probability • Mitigation: lessen the consequences • Regulation

  18. Principle of Acceptable Risk • People should be protected from the harmful effects of technology, especially when the harms are not consented to or when they are unjustly distributed, except that this protection must be balanced against: a) the need to preserve great and irreplaceable benefits and b) the limitations on our ability to obtain informed consent (Harris et al. 1995, p.243)

  19. The Precautionary Principle • Emanates from the wish to protect humans and nature, even if there is no scientific evidence of the extent and cause of the environmental problem (Ministry of Environment and Energy, Denmark, 1998) • When there is a threat of serious or irreversible environmental damage, the lack of scientific certainty should not be used as a reason for postponing action to prevent environmental degradation (Bradley 2000)

  20. Other issues in managing risk • Risk tradeoffs: • E.g. hospital care → may cause other illnesses • Media influences public risk perception. To what extent is the media responsible for technology policy decisions? • Informed consent: people should be fully informed if they are expected to accept risk. Yet the proliferation of warning labels may be shifting attention to trivial risks • Quantifying risk: to make decisions it is helpful to quantify risk, but it seems morally repugnant to quantify the value of life, especially when different lives are assigned different values • Should policy be directed at protecting the least powerful, or should it be aimed at saving the maximum number of lives?

  21. Resources
  Bradley, M. The Precautionary Principle: A Definition. CANFOR. (http://www.opendoors.cppa) (visited Feb 10, 2000)
  Graham, John D. and Jonathan B. Wiener. 1995. Risk vs. Risk: Tradeoffs in Protecting Health and the Environment. Cambridge, MA: Harvard University Press.
  Harris, Charles, Michael Pritchard and Michael Rabins. 1995. Engineering Ethics: Concepts and Cases. Boston: Wadsworth Publishing Co.
  Lewis, H. W. 1990. Technological Risk. New York: W. W. Norton. (Thode: T174.5.L48 1990)
  Ministry of Environment and Energy, Denmark. 1998. Faktuelt. (http://www.mem.dk/faktuelt/fak15_eng.htm) (visited Feb 10, 2000)
  Perrow, Charles. 1984. Normal Accidents: Living with High-Risk Technologies. New York: Basic Books. (Thode: T54.P47 1984)
  Rasmussen, Norman C. 1990. The Application of Probabilistic Risk Assessment Techniques to Energy Technologies. In Readings in Risk (Glickman, Theodore and Michael Gough, eds.). Washington, DC: Resources for the Future. (Thode: T174.5.R38 1990)
  Wolfe, Amy K., David J. Bjornstad, Milton Russell and Nichole D. Kerchner. 2002. A framework for analyzing dialogues over the acceptability of controversial technologies. Science, Technology and Human Values, Vol. 27, No. 1, Winter, pp. 134-159. (Coursepack)
