
Uncertainty and quality in scientific policy assessment: introductory remarks



- Integrated Assessment of Health Risks of Environmental Stressors in Europe

INTARESE


Martin Krayer von Krauss, WHO/EEA

- Intarese contingent
- National public institutes
- Academia
- Other: WHO, CEFIC, etc.

- EEA contingent
- Others
- Marco Martuzzi & David Gee
- Jeroen van der Sluijs & Arthur Petersen
- Jerry Ravetz and Andrea Saltelli
- Jacqueline McGlade

Science with a twist!

- Stakes are high;
- Values are in dispute;
- Facts are uncertain.
- In a situation where very few of our assessments can truly be “validated”, and where the consequences of error could be far-reaching, how do we ensure the quality of our work?

Provide you with an understanding of:

- The context within which science for policy is conducted;
- The rationale for the interest in uncertainty and quality in policy assessments;
- Qualitative as well as quantitative conceptions of uncertainty;
- Qualitative and quantitative approaches to managing uncertainty and quality;
- The relationship between uncertainty, quality and stakeholder participation.

- Lecturing
- Discussions
- Exercise
- Workshop dinner!

- Chair: Gordon McInnes, Deputy Director, EEA
- Presentations:
- Arthur Petersen, Director of the Methodology and Modelling Programme, MNP
- Jeannette Beck, Program Leader on Air Quality, MNP
- Jan Wijmenga, Netherlands Ministry of Housing, Spatial Planning and the Environment

- Discussion: Quality and uncertainty management needs in science for policy
- Sigfus Bjarnason, Head of group, EEA

- Lecturing (30 minutes)
- Introduction by Jerry Ravetz
- Reflexivity and the Reflexive Practitioner (AP)
- Levels of uncertainty (MKvK)

- Discussion (30 minutes)

Three dimensions of uncertainty: Location, Nature and Level.

See Walker et al. (2003), Journal of Integrated Assessment, 4(1): 5-17.

- Refers to the location at which the uncertainty manifests itself in the model
- Examples of common assessment models:
- Risk Assessment:
Risk = Probability x Consequence

- Environmental & Health assessment of chemicals:
Risk = Exposure x Effect

- Risk Assessment:
- Generic model locations:
- Context (e.g. boundaries, framing)
- Input data
- Model uncertainty
- Calibration data
- Parameter uncertainty
- Model output (conclusion)
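The two generic assessment models above can be sketched in a few lines. This is a minimal illustration, not part of the course material, and the numeric inputs are hypothetical:

```python
# Sketch of the two generic assessment models named above.
# All numbers below are hypothetical, for illustration only.

def risk_from_probability(probability: float, consequence: float) -> float:
    """Risk Assessment: Risk = Probability x Consequence."""
    return probability * consequence

def risk_from_exposure(exposure: float, effect: float) -> float:
    """Environmental & Health assessment of chemicals: Risk = Exposure x Effect."""
    return exposure * effect

# Uncertainty can enter at each model location: the context and framing
# (which model to use at all), the input data, the parameters, the model
# form itself, and therefore the output.
print(risk_from_probability(0.01, 1000.0))  # -> 10.0
```

Note that the choice between the two formulas is itself a context/framing decision, i.e. an uncertainty located upstream of any input data.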

[Diagram: the Level dimension of uncertainty within the Location/Nature/Level typology – ranging from Statistical Uncertainty, through Scenario Uncertainty and Recognised Ignorance, to Total Ignorance.]

e.g. see Knight, 1921; Smithson, 1988; Funtowicz and Ravetz, 1990; Faber et al., 1992; Wynne, 1992; Schneider & Turner, 1994; ESTO, 2001.


- There exist solid grounds for the assignment of a discrete probability to each of a well-defined set of outcomes.
- We have a well-known functional relationship;
- We have an adequate combination of: (i) number of parameters and (ii) amount and character of data.
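Statistical uncertainty, as characterised above, can be sketched with a toy example. The outcome set, probabilities and consequence values below are assumptions made up for illustration:

```python
# Statistical uncertainty (illustrative, assumed numbers): a well-defined
# set of outcomes, each with a defensible probability, so the uncertainty
# can be summarised with standard statistics.
outcomes = {"no effect": 0.70, "mild effect": 0.25, "severe effect": 0.05}
assert abs(sum(outcomes.values()) - 1.0) < 1e-9  # the outcome set is complete

# With a known distribution we can compute expectations directly.
consequence = {"no effect": 0.0, "mild effect": 10.0, "severe effect": 200.0}
expected_consequence = sum(p * consequence[o] for o, p in outcomes.items())
print(expected_consequence)  # 0.25*10 + 0.05*200 = 12.5
```

It is precisely this completeness of the outcome set, and the solidity of the probabilities, that is lost at the higher levels of uncertainty discussed next.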

[Figure: a set of possible outcomes is identified along the axis, but question marks stand in place of their probabilities.]

- We can describe a set of outcomes to be expected, but we cannot associate probabilities very well.
- Assumptions; various plausible scenarios; unverified “what if ?” questions;
- Ambiguous results.

Scenario: resistance to antibiotics

- The widespread use of antibiotics could lead to the development of resistant bacterial strains;
- In the long run, antibiotics would no longer be effective in the treatment of disease in humans.

- Scientific evidence: the development of bacterial resistance can take place.
- How quickly and to what extent will it become a problem?

The outcome is clear, but the probability of it occurring is unknown.
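The antibiotic-resistance example can be sketched as scenario handling in code. The scenario names and descriptions below are hypothetical placeholders:

```python
# Scenario uncertainty (hypothetical scenarios): the set of outcomes can be
# described, but no defensible probabilities can be attached, so results are
# reported per scenario rather than aggregated into an expected value.
scenarios = {
    "slow resistance": "antibiotics remain effective for decades",
    "rapid resistance": "antibiotic effectiveness is lost within years",
}

def report(scenarios: dict) -> list:
    # No weighting, no expectation: each "what if?" is carried forward as-is.
    return [f"IF {name}: {outcome}" for name, outcome in scenarios.items()]

for line in report(scenarios):
    print(line)
```

Contrast this with the statistical case: here nothing licenses collapsing the scenarios into a single number, so the assessment output is the list itself.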


- We do not know the essential functional relationships.
- There exist neither grounds for the assignment of probabilities, nor even the basis for the definition of a complete set of outcomes.
- More information may become known later through research, but little is known for the time being.
- Recognized ignorance:
We know that we don’t know!

Consider the case of a scientist asked to assess the risks or the costs of BSE at the time of its discovery in 1986.

- No historical data on BSE was available and scientific understanding of how the disease is contracted was limited.
- The extent of the public outcry that would eventually occur remained unknown, as did the extent of the loss of exports and the drop in domestic demand that ensued.
- Knowledge on the relationship between BSE and CJD would not become available for another 10 years.
Any assessment would necessarily rely on a large number of assumptions; there would be no credible basis for the assignment of probabilities.

There would not even be a credible basis to claim that all of the potential outcomes of the BSE epidemic had been thought of.


Different levels of uncertainty call for different approaches to uncertainty assessment and management!

- Lecturing (45 minutes): Jeroen van der Sluijs
- Knowledge quality assessment
- Problem framing and context
- Indicators
- Intro: approaches to uncertainty management

- Coffee break (10 minutes)
- Discussion (30 minutes)
- Comfort break (5 minutes)

- Lecturing (45 minutes)
- Quantitative approaches by Andrea Saltelli
- Qualitative approaches (JvdS)
- Procedural approaches (JvdS)

- Comfort break (5 minutes)
- Discussion (40 minutes)

- Presentation (15 minutes) (JvdS/AP/MKvK)
- Intro to case study
- Intro to assignment

- Facilitated group work
- 12.00-13.00 Lunch
- 13.00-14.20 Session 5 Continued
- Presentation & discussion of results (60 minutes)
- Presentation of assignment for interim period (15 minutes)

- Coffee break (10 minutes)

- Presentation (40 minutes) (JvdS)
- Why participation?
- Introduction to approaches.

- Discussion (40 minutes)
- 15.30-16.00 Discussion and evaluation of course