Uncertainty and quality in scientific policy assessment - introductory remarks
Presentation Transcript



  • Integrated Assessment of Health Risks of Environmental Stressors in Europe

INTARESE

Uncertainty and quality in scientific policy assessment - introductory remarks -

Martin Krayer von Krauss, WHO/EEA



Introductions

  • INTARESE contingent

    • National public institutes

    • Academia

    • Other: WHO, CEFIC, etc.

  • EEA contingent

  • Others

  • Marco Martuzzi & David Gee

  • Jeroen van der Sluijs & Arthur Petersen

  • Jerry Ravetz and Andrea Saltelli

  • Jacqueline McGlade



Context

Science with a twist!



Context

  • Stakes are high;

  • Values are in dispute;

  • Facts are uncertain.

  • In a situation where very few of our assessments can truly be “validated”, and where the consequences of error could be far reaching, how do we ensure the quality of our work?



Workshop Objective

Provide you with an understanding of:

  • The context within which science for policy is conducted;

  • The rationale for the interest in uncertainty and quality in policy assessments;

  • Qualitative as well as quantitative conceptions of uncertainty;

  • Qualitative and quantitative approaches to managing uncertainty and quality;

  • The relationship between uncertainty, quality and stakeholder participation.



Dynamic learning

  • Lecturing

  • Discussions

  • Exercise

  • Workshop dinner!



Setting the scene: The RIVM credibility crisis of 1999 and the subsequent response

  • Chair: Gordon McInnes, Deputy Director, EEA

  • Presentations:

    • Arthur Petersen, Director of the Methodology and Modelling Programme, MNP

    • Jeannette Beck, Program Leader on Air Quality, MNP

    • Jan Wijmenga, Netherlands Ministry of Housing, Spatial Planning and the Environment

  • Discussion: Quality and uncertainty management needs in science for policy

    • Sigfus Bjarnason, Head of group, EEA



Session 2: The context 13.00-14.00

  • Lecturing (30 minutes)

    • Introduction by Jerry Ravetz

    • Reflexivity and the Reflexive Practitioner (AP)

    • Levels of uncertainty (MKvK)

  • Discussion (30 minutes)


Uncertainty: a 3-dimensional concept

[Diagram: the three dimensions of uncertainty - Location, Nature, Level]

See: Walker et al. (2003), Integrated Assessment, 4(1): 5-17.



Location of Uncertainty (1st dimension)

  • Refers to the location at which the uncertainty manifests itself in the model

  • Examples of common assessment models:

    • Risk Assessment:

      Risk = Probability x Consequence

    • Environmental & Health assessment of chemicals:

      Risk = Exposure x Effect

  • Generic model locations:

    • Context (e.g. boundaries, framing)

    • Input data

    • Model uncertainty

    • Calibration data

    • Parameter uncertainty

    • Model output (conclusion)
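The locations above can be made concrete with a short sketch. The following is a minimal Monte Carlo illustration of the Risk = Exposure x Effect model; all distribution choices and parameter values are assumptions for illustration, not taken from the slides. It shows how uncertainty at different locations (input data, parameters) propagates to the model output.

```python
import random

random.seed(42)

# Hypothetical illustration: propagate uncertainty through the simple
# assessment model  Risk = Exposure x Effect.
N = 10_000

risks = []
for _ in range(N):
    # Location: input data -- exposure level, assumed lognormally distributed
    exposure = random.lognormvariate(mu=0.0, sigma=0.5)
    # Location: parameter -- dose-response slope, assumed with sampling error
    effect = random.gauss(mu=0.1, sigma=0.02)
    # Location: model output -- the conclusion inherits both uncertainties
    risks.append(exposure * effect)

risks.sort()
print(f"median risk: {risks[N // 2]:.3f}")
print(f"90% interval: [{risks[int(0.05 * N)]:.3f}, {risks[int(0.95 * N)]:.3f}]")
```

Even this toy model shows why reporting only a point estimate at the output location hides the uncertainty that entered upstream.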


Level of Uncertainty (2nd dimension)

[Diagram: levels of uncertainty arranged along the Level dimension, from Statistical Uncertainty through Scenario Uncertainty and Recognised Ignorance to Total Ignorance]

e.g. see Knight, 1921; Smithson, 1988; Funtowicz and Ravetz, 1990; Faber et al., 1992; Wynne, 1992; Schneider & Turner, 1994; ESTO, 2001.


Statistical Uncertainty

[Figure: a well-defined probability distribution over outcomes]

  • There exist solid grounds for the assignment of a discrete probability to each of a well-defined set of outcomes.

  • We have a well-known functional relationship.

  • We have an adequate combination of: (i) number of parameters and (ii) amount and character of data.
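As a minimal sketch of these conditions (the measurement values below are invented for illustration): when the functional form is known (here, a normal error model) and the data are adequate, uncertainty reduces to a probability statement about a parameter, such as a confidence interval on an estimated mean.

```python
import statistics

# Hypothetical measurements of a single quantity (values are assumptions)
measurements = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.1, 4.9, 5.0]

mean = statistics.mean(measurements)
# Standard error of the mean: sample stdev over sqrt(n)
sem = statistics.stdev(measurements) / len(measurements) ** 0.5

# Normal-approximation 95% interval for the true mean
low, high = mean - 1.96 * sem, mean + 1.96 * sem
print(f"estimate: {mean:.2f}, 95% CI: [{low:.2f}, {high:.2f}]")
```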


Scenario Uncertainty

[Figure: a known set of possible outcome values, with question marks in place of their probabilities]

  • We can describe a set of outcomes to be expected, but we cannot associate probabilities very well.

  • Assumptions; various plausible scenarios; unverified "what if?" questions;

  • Ambiguous results.
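Working under scenario uncertainty can be sketched as running an assessment model once per plausible scenario and reporting the spread of outcomes without attaching probabilities. The toy model, scenario names, and parameter values below are all assumptions for illustration.

```python
# Hypothetical toy model: years until antibiotic resistance becomes a
# clinical problem, as a function of assumed usage and spread rate.
def resistance_onset_years(usage_tonnes_per_year: float, spread_factor: float) -> float:
    return 1000.0 / (usage_tonnes_per_year * spread_factor)

# A set of plausible scenarios -- enumerable, but with no defensible weights
scenarios = {
    "low use, slow spread":  dict(usage_tonnes_per_year=10.0, spread_factor=1.0),
    "high use, slow spread": dict(usage_tonnes_per_year=50.0, spread_factor=1.0),
    "high use, fast spread": dict(usage_tonnes_per_year=50.0, spread_factor=5.0),
}

results = {name: resistance_onset_years(**params) for name, params in scenarios.items()}
for name, years in results.items():
    # Report the range of outcomes; deliberately attach no probabilities
    print(f"{name}: ~{years:.0f} years")
```

The output is a range, not a distribution: the honest summary is "somewhere between the best and worst scenario", which is exactly what distinguishes scenario uncertainty from statistical uncertainty.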



Example of scenario uncertainty: Antibiotics in animal feedstuff

Scenario: resistance to antibiotics

  • The widespread use of antibiotics could lead to the development of resistant bacterial strains;

  • In the long run, antibiotics would no longer be effective in the treatment of disease in humans.

  • Scientific evidence: the development of bacterial resistance can take place.

  • How quickly and to what extent will it become a problem?

The outcome is clear; the probability of it occurring is unknown.


Ignorance

[Diagram: levels of uncertainty along the Level dimension, from Statistical Uncertainty through Scenario Uncertainty and Recognised Ignorance to Total Ignorance; Ignorance occupies the far end]

  • We do not know the essential functional relationships.

  • There exist neither grounds for the assignment of probabilities, nor even the basis for the definition of a complete set of outcomes.

  • More information may become known later through research, but little is known for the time being.

  • Recognized ignorance:

    We know that we don’t know!



Example of ignorance: Mad cow disease & CJD

Consider the case of a scientist asked to assess the risks or the costs of BSE at the time of its discovery in 1986.

  • No historical data on BSE was available and scientific understanding of how the disease is contracted was limited.

  • The extent of the public outcry that would eventually occur remained unknown, as did the extent of the loss of exports and the drop in domestic demand that ensued.

  • Knowledge on the relationship between BSE and CJD would not become available for another 10 years.

    Any assessment would necessarily rely on a large number of assumptions; there would be no credible basis for the assignment of probabilities.

    There would not even be a credible basis to claim that all of the potential outcomes of the BSE epidemic had been thought of.


[Diagram: levels of uncertainty, from Statistical Uncertainty through Scenario Uncertainty and Recognised Ignorance to Total Ignorance]

Different levels of uncertainty call for different approaches to uncertainty assessment and management!



Session 3: Basic concepts 14.00-15.30

  • Lecturing (45 minutes): Jeroen van der Sluijs

    • Knowledge quality assessment

    • Problem framing and context

    • Indicators

    • Intro: approaches to uncertainty management

  • Coffee break (10 minutes)

  • Discussion (30 minutes)

  • Comfort break (5 minutes)



Session 4: Approaches to managing uncertainty 15.30-17.00

  • Lecturing (45 minutes)

    • Quantitative approaches by Andrea Saltelli

    • Qualitative approaches (JvdS)

    • Procedural approaches (JvdS)

  • Comfort break (5 minutes)

  • Discussion (40 minutes)



Session 5: Exercise 09.30-14.20

  • Presentation (15 minutes) (JvdS/AP/MKvK)

    • Intro to case study

    • Intro to assignment

  • Facilitated group work

  • 12.00-13.00 Lunch

  • 13.00-14.20 Session 5 Continued

    • Presentation & discussion of results (60 minutes)

    • Presentation of assignment for interim period (15 minutes)

  • Coffee break (10 minutes)



Session 6: Stakeholder participation 14.30-15.30

  • Presentation (40 minutes) (JvdS)

    • Why participation?

    • Introduction to approaches.

  • Discussion (40 minutes)

  • 15.30-16.00 Discussion and evaluation of course

