
Presentation Transcript


  1. The IPCC as parliament of things: Dealing with uncertainty and value commitments in climate simulation

  2. IBM Supercomputer, European Centre for Medium-Range Weather Forecasts

  3. IPCC 2001: taking all uncertainties into account (including model uncertainty), the largest part of the observed warming is ‘likely’ due to anthropogenic greenhouse gases. Warning: take uncertainty in climate simulation into account.

  4. IPCC 2007: taking all uncertainties into account (including model uncertainty), the largest part of the observed warming is ‘very likely’ due to anthropogenic greenhouse gases.

  5. de Kwaadsteniet versus van Egmond • de Kwaadsteniet: “Computer simulations are seductive due to their perceived speed, clarity and consistency. However, simulation models are not rigorously compared with data.” • van Egmond: “Policy makers are confronted with incomplete knowledge; it is the task of scientific advisers to report on the current state of knowledge, including its uncertainties. Simulation models are indispensable.”

  6. Simulation in scientific practice • Definition of computer simulation: “mathematical model that is implemented on a computer and imitates real-world processes” • Functions of simulation: • technique (to investigate detailed dynamics) • heuristic tool (to develop hypotheses, theories) • substitute for an experiment • tool for experimentalists • pedagogical tool

  7. Central activities in simulation practice • Formulating the mathematical model (conceptual and mathematical model: ‘ideas’) • Preparing the model inputs (model inputs: ‘marks’) • Implementing and running the model (technical model implementation: ‘things’) • Processing the data and interpreting them (processed output data: ‘marks’)
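A minimal sketch of how these four activities map onto code, assuming a toy zero-dimensional energy-balance model as the “mathematical model” of slide 6; the forcing scenario, parameter values and function names are illustrative, not taken from the presentation.

```python
# Toy illustration of the four central activities in simulation practice,
# using a zero-dimensional energy-balance model (illustrative values only).
import numpy as np

# 1. Formulating the mathematical model ('ideas'):
#    C dT/dt = F(t) - lambda * T, with T the temperature anomaly (K),
#    F a radiative forcing (W/m^2), lambda a feedback parameter,
#    C an effective heat capacity (W yr m^-2 K^-1).
def dT_dt(T, forcing, lam=1.3, heat_capacity=8.0):
    return (forcing - lam * T) / heat_capacity

# 2. Preparing the model inputs ('marks'): a hypothetical linear forcing ramp.
years = np.arange(2000, 2101)
forcing = np.linspace(0.0, 4.0, years.size)   # assumed scenario, W/m^2

# 3. Implementing and running the model ('things'): simple Euler time stepping.
T = np.zeros(years.size)
for i in range(1, years.size):
    T[i] = T[i - 1] + dT_dt(T[i - 1], forcing[i - 1])   # time step = 1 year

# 4. Processing the output data and interpreting them ('marks').
print(f"Warming by {years[-1]} in this toy run: {T[-1]:.2f} K")
```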

  8. Four claims re. climate simulation • Different models give conflicting descriptions of the climate system. • There exists no unequivocal methodology for climate simulation. • The assumptions in climate simulations are value-laden. • Pluralism in climate modelling is an essential requirement both for ‘good’ science and for ‘appropriate’ science advising.

  9. Funtowicz and Ravetz, “Science for the post-normal age”, Futures, 1993

  10. The challenge of post-normal science • Expert advisers should be reflexive • Methods for dealing with uncertainty should be considered merely as tools, not as solutions • Fear of paralysis in policy making should not be allowed to block communication about uncertainty • Communication with a wider audience about uncertainties is crucial

  11. Shifting notions of reliability • Statistical reliability (expressed in terms of probability) • How do you statistically assess climate predictions? • Methodological reliability (expressed qualitatively in terms of weak/strong points) • How do you determine the methodological quality of the different elements in simulation practice, given the purpose of the model? • Public reliability (expressed in terms of public trust) • What determines public trust in modellers?
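To make the question “How do you statistically assess climate predictions?” concrete: for forecasts that can be verified repeatedly (e.g. seasonal predictions), one standard measure is the Brier score; the probabilities and outcomes below are invented for illustration. For multi-decadal projections there is no comparable archive of verifiable cases, which is exactly why methodological and public reliability matter as well.

```python
# Brier score: mean squared difference between forecast probabilities
# and observed binary outcomes (0 = event did not occur, 1 = it did).
# Lower is better; a perfectly reliable and sharp forecast scores 0.
import numpy as np

forecast_prob = np.array([0.9, 0.7, 0.2, 0.8, 0.1])  # hypothetical probabilities
observed      = np.array([1,   1,   0,   1,   0])    # hypothetical outcomes

brier = np.mean((forecast_prob - observed) ** 2)
print(f"Brier score: {brier:.3f}")
```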

  12. Lessons learnt in uncertainty communication (I) • The conditional character of probabilistic projections requires being clear about assumptions and potential consequences (e.g. robustness, things left out) • Room for further development in probabilistic uncertainty projections: how to deal properly with model ensembles, accounting for model discrepancies • There is a role for knowledge quality assessment, complementary to more quantitative uncertainty assessment
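A minimal sketch of what “dealing with model ensembles, accounting for model discrepancies” can look like at the simplest level: summarizing a small multi-model ensemble by its mean and inter-model spread. The projected values are invented for illustration.

```python
# Hypothetical projected warming for 2100 (K) from five different models.
import numpy as np

ensemble = np.array([2.1, 2.8, 3.4, 2.5, 4.0])

mean = ensemble.mean()
spread = ensemble.std(ddof=1)                 # inter-model standard deviation
low, high = np.percentile(ensemble, [5, 95])

print(f"Ensemble mean: {mean:.1f} K")
print(f"Inter-model spread (1 sigma): {spread:.1f} K")
print(f"5-95% range across models: {low:.1f}-{high:.1f} K")
# Note: models are not independent samples from a well-defined population,
# so this spread is not a true probability distribution of future climate;
# hence the complementary role of knowledge quality assessment.
```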

  13. Lessons learnt in uncertainty communication (II) • Recognizing ignorance is often more important than characterizing statistical uncertainty • Communicate uncertainty in terms of societal/political risks

  14. A case of deep uncertainty: adaptation to changes in extreme weather in the Netherlands • Extreme weather events are predominantly associated with the risk of flooding, which is generally considered a government responsibility. • However, future projections for the Netherlands paint a somewhat more complex picture. The projected changes require awareness in society at large. • Yet individuals and economic sectors have dealt with the weather for ages and have developed knowledge and behavioural responses to weather extremes.

  15. Notions from the policy sciences and social psychology • Articulation (views with respect to extreme weather are not always coherent) • Information (access to information shapes articulation) • Differentiation (different perspectives lead to different assessments of risk and potentials) • Learning by interaction (stakeholders and scientists can learn from interacting, by which preferences for, possibly new, policy options evolve)

  16. “Two cold winters don't deny global warming” • The Dutch winter of 2009-2010 was the coldest since 1996 • Questions one may ask: • How 'extreme' was this? • Will this happen less (or more...) often in the future? • Does this fit the 'global warming' picture? • How to optimally adapt to changes in extremes?

  17. Eleven city marathon • The marathon has been organized 15 times in the period 1901-2008, in the province of Friesland • How has the chance of holding the marathon changed over the past century? • How will it change in the future?

  18. Eleven city marathon • Projections for 2050 under four scenarios: once every 18, 29, 55 or 183 years
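A small worked example of the arithmetic behind these two slides, using only the figures quoted above (15 events in 1901-2008 and the four projected return periods for 2050); the conversion to annual probabilities and to a 30-year horizon is added here for illustration, not taken from the presentation.

```python
# Empirical chance of holding the eleven city marathon, past and projected.
events, n_years = 15, 2008 - 1901 + 1          # 15 events in 108 years
p_past = events / n_years
print(f"Historical: p ≈ {p_past:.2f} per year, i.e. roughly once every {1 / p_past:.0f} years")

# 2050 projections quoted on the slide, as return periods in years.
for return_period in (18, 29, 55, 183):
    p = 1 / return_period
    p_30yr = 1 - (1 - p) ** 30                 # chance of at least one event in 30 years
    print(f"Once every {return_period:3d} years -> p ≈ {p:.3f} per year, "
          f"{p_30yr:.0%} chance of at least one event in 30 years")
```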

  19. What is the impact of weather extremes, and how can we adapt?

  20. Bridging the gap between science and policy • Uncertainties with respect to climate change and extreme weather events; knowledge about the future is based on models • Need for adaptive governance and for a methodology to assess policy options with different, even conflicting, outcomes • Need for indicators of outcomes for evaluating policy options, relevant for stakeholders and reliable for scientists

  21. Bridging the gap: selected approach Two postdoctoral studies: • Social scientist: engaging with the stakeholders; analysing the process; co-developing adaptation options • Statistician/climate scientist: studying the uncertainty range of climate projections and decadal predictions of weather extremes; co-developing indicators • Team of political scientists, statisticians, climate modellers and social scientists

  22. Global climate models and regional embedded models [Figure: global model with embedded regional model]

  23. Different sources of uncertainty - Global (Source: E. Hawkins & R. Sutton, Bull. Amer. Meteor. Soc., Aug. 2009, 1097-1107)

  24. Different sources of uncertainty - Regional (Source: E. Hawkins & R. Sutton, Bull. Amer. Meteor. Soc., Aug. 2009, 1097-1107)
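A hedged sketch of the kind of decomposition behind the Hawkins & Sutton figures: partitioning the spread of an ensemble of projections into internal variability, model uncertainty and scenario uncertainty. The array shapes and synthetic numbers below are invented, and the partitioning is deliberately cruder than the trend-fitting procedure used in the paper.

```python
# Rough variance partitioning of projections with shape (scenario, model, realization).
import numpy as np

rng = np.random.default_rng(0)
n_scen, n_model, n_real = 3, 5, 4
# Synthetic projected warming for one target year (K):
# scenario effect + model effect + internal noise.
scen_effect = np.array([1.5, 2.5, 3.5])[:, None, None]
model_effect = rng.normal(0.0, 0.4, size=(1, n_model, 1))
noise = rng.normal(0.0, 0.2, size=(n_scen, n_model, n_real))
proj = scen_effect + model_effect + noise

internal = proj.var(axis=2).mean()              # spread across realizations
model_u = proj.mean(axis=2).var(axis=1).mean()  # spread across models, per scenario
scenario = proj.mean(axis=(1, 2)).var()         # spread across scenario means
total = internal + model_u + scenario

for name, v in [("internal variability", internal),
                ("model uncertainty", model_u),
                ("scenario uncertainty", scenario)]:
    print(f"{name:>21s}: {v / total:.0%} of the combined variance")
```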

  25. Example from the Intergovernmental Panel on Climate Change WG I (2007) “Most of the observed increase in globally averaged temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.[12]” (SPM) [12] Consideration of remaining uncertainty is based on current methodologies.

  26. Example from the IPCC WG I 2007 (continued) “Very likely” means a chance > 90%. But what kind of probability are we dealing with here? • Draft SPM: “assessed likelihood of an outcome or a result” • Final SPM: “assessed likelihood, using expert judgement, of an outcome or a result”
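For reference, the calibrated likelihood language that “very likely” belongs to, as defined in the AR4 uncertainty guidance; the snippet below simply encodes that scale as a lookup table.

```python
# IPCC AR4 likelihood terms and the probability ranges they stand for.
LIKELIHOOD_SCALE = {
    "virtually certain":      "> 99%",
    "extremely likely":       "> 95%",
    "very likely":            "> 90%",
    "likely":                 "> 66%",
    "more likely than not":   "> 50%",
    "about as likely as not": "33% to 66%",
    "unlikely":               "< 33%",
    "very unlikely":          "< 10%",
    "extremely unlikely":     "< 5%",
    "exceptionally unlikely": "< 1%",
}

print(LIKELIHOOD_SCALE["very likely"])   # the term used in the attribution statement
```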

  27. Importance of identifying high-confidence findings

  28. Process: Openness, peer review, supervision • Openness: PBL registration website for possible errors • 40 reactions in total, 3 of which were relevant for our investigation • Drew on IPCC authors to give feedback • Internal and external peer review • Independent supervision by the KNAW (Royal Netherlands Academy of Arts and Sciences)

  29. Considerable risk of losing uncertainty information

  30. What can go wrong? • E1 Inaccurate statement • E1a Errors that can be corrected by an erratum (5) • E1b Errors that require a redoing of the assessment of the issue at hand (2) • E2 Inaccurate referencing (3) • C1 Insufficiently substantiated attribution (1) • C2 Insufficiently founded generalization (2) • C3 Insufficiently transparent expert judgment (10) • C4 Inconsistency of messages (2) • C5 Untraceable reference (3) • C6 Unnecessary reliance on grey referencing (2) • C7 Statement unavailable for review (1)

  31. Errors and shortcomings in AR4 WG II (8 chapters)
