
Why it is good to be uncertain?


Presentation Transcript


  1. Why it is good to be uncertain? Martin Wattenbach, Pia Gottschalk, Markus Reichstein, Dario Papale, Jagadeesh Yeluripati, Astley Hastings, Marcel van Oijen, Pete Smith – Members of JUTF

  2. Outline • Uncertainty - the big unknown • Some general thoughts about sources of uncertainty • Uncertainty in CarboEurope • Measurement uncertainty • Model uncertainty • JUTF activities • Where do we go next?

  3. Uncertainty - the big unknown “We demand rigidly defined areas of doubt and uncertainty!” Douglas Adams

  4. Uncertainty - the big unknown • Uncertainty is a measure of the lack of confidence we have in our experimental or modelling results after we have corrected for any known error • We cannot measure it; we can only estimate its range of probability • Consequently, uncertainty is not error
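
A minimal numerical sketch of the distinction made on this slide, using invented measurement values: a known error (here an instrument bias) is corrected away, and what remains is expressed as an estimated probability range rather than a single measured number.

```python
import numpy as np

# Hypothetical repeated measurements of the same flux (arbitrary units),
# affected by a known instrument bias of +0.8 that we can correct for.
measurements = np.array([12.1, 11.4, 13.0, 12.6, 11.9, 12.3])
known_bias = 0.8

corrected = measurements - known_bias                  # known error: removed
best_estimate = corrected.mean()
sem = corrected.std(ddof=1) / np.sqrt(len(corrected))  # standard error of mean

# Uncertainty is not measured directly; it is estimated as a range of
# probability around the corrected best estimate (approximate 95% interval).
low, high = best_estimate - 1.96 * sem, best_estimate + 1.96 * sem
print(f"best estimate: {best_estimate:.2f}, approx. 95% range: [{low:.2f}, {high:.2f}]")
```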

  5. Uncertainty - sources • Incomplete or imperfect observations • Incomplete conceptual frameworks • Inaccurate prescriptions of known processes • Chaos • Lack of predictability (Source: IPCC, Martin Manning and Michel Petit)

  6. Uncertainty - sources [Figure: typology of uncertainty sources linking measurements, scenarios and the ecosystem model – measured/statistical uncertainty (type A), scientific judgement uncertainty (type B), baseline uncertainty (type C), scenario uncertainty (type D), conceptual uncertainty (type E)]

  7. Measurement Uncertainty

  8. Model uncertainty • Monte Carlo (multiple model runs) with the DNDC* model: an input parameter distribution is propagated to an output distribution. The discrepancy between the mean value of the Monte Carlo runs and the annual value obtained from a single run using the best estimates suggests that using the best estimate may not lead to the most probable model result. (*DeNitrification-DeComposition model)
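
The effect described on this slide can be reproduced with a toy nonlinear model (a stand-in, not DNDC): propagating the input parameter distribution through the model by Monte Carlo gives a mean output that differs from the single run made with the best-estimate parameter value.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(k):
    """Stand-in nonlinear ecosystem response; not the DNDC model."""
    return 100.0 * np.exp(-0.5 * k)

k_best = 1.0                                                  # best estimate
k_samples = rng.normal(loc=k_best, scale=0.4, size=100_000)   # input distribution

single_run = toy_model(k_best)      # one run with the best estimate
mc_runs = toy_model(k_samples)      # Monte Carlo output distribution

print(f"single best-estimate run: {single_run:.1f}")
print(f"Monte Carlo mean        : {mc_runs.mean():.1f}")
# Because the model is nonlinear, the two values differ: the best-estimate
# run is not necessarily the mean (or most probable) model result.
```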

  9. Model uncertainty – global uncertainty (Gottschalk et al. 2007)

  10. Model uncertainty - factor importance [Figure: factor importance for the sites Oensingen In, Laqueuille In, Oensingen Ex and Laqueuille Ex in 2002 and 2003; Gottschalk et al. 2007]
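
As a rough illustration of what a factor-importance analysis does (a sketch with invented factors only; the method and models in Gottschalk et al. 2007 may differ), the Monte Carlo samples of each uncertain input can be ranked by how strongly they are associated with the model output:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical uncertain input factors
soil_carbon = rng.normal(60.0, 10.0, n)     # t C per ha
clay_frac   = rng.uniform(0.10, 0.40, n)    # fraction
temperature = rng.normal(9.0, 1.5, n)       # deg C

def toy_flux(soil_carbon, clay_frac, temperature):
    """Stand-in model output; not the model used in the study."""
    return 0.02 * soil_carbon * (1.0 - clay_frac) * 1.07 ** temperature

flux = toy_flux(soil_carbon, clay_frac, temperature)

# Rank factors by the strength of their correlation with the output
for name, x in [("soil_carbon", soil_carbon),
                ("clay_frac", clay_frac),
                ("temperature", temperature)]:
    r = np.corrcoef(x, flux)[0, 1]
    print(f"{name:12s} correlation with output: {r:+.2f}")
```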

  11. JUTF - Joint Uncertainty Task Force • Two projects – two shared aims: • CarboEurope : • CarboEurope-IP aims to understand and quantify the present terrestrial carbon balance of Europe and the associated uncertainty at local, regional and continental scale. • NitroEurope: • an observing system for N fluxes and pools [Component 1] • a network of manipulation experiments [Component 2] • plot-scale C-N modelling [Component 3] • landscape analysis [Component 4] • European up-scaling [Component 5] and • uncertainty and verification of European estimates [Component 6] • Joint efforts = JUTF

  12. JUTF key activities • Workshop in spring 2007 that brought together people from both projects. We (CEU) learned about: • The NEU protocols for good modelling practice and for uncertainty quantification and analysis • Marcel van Oijen's approach to Bayesian calibration and model comparison, one of the key features of the uncertainty analysis methods • Agreement on a joint model comparison exercise across scales using the Bayesian approach • Implementation of the up-scaling approach used in NEU as one method for up-scaling CEU croplands

  13. Where do we go next? [Figure: Bayesian calibration – data and the prior parameter pdf of a model are combined to give the posterior pdf]

  14. Why do we go Bayesian? • BC uses parameter pdfs instead of best estimates • It takes data pdfs into account • Bayes' Theorem is used to calculate the posterior parameter pdf • Bayes' Theorem is used to quantify the plausibility of different models • It will reduce the uncertainty in our model results, provided the model represents the system correctly • All future model runs are done with samples from the parameter pdf (i.e. the uncertainty of the model results is quantified) • BC can use data to reduce parameter uncertainty for any process-based model (see the sketch below)
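
The bullet points above can be condensed into a minimal Metropolis sketch of Bayesian calibration for a toy one-parameter model with invented observations (not one of the CEU or NEU models): the prior parameter pdf and the data likelihood are combined via Bayes' Theorem into a posterior pdf, and future model runs sample from that posterior.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(k, t):
    """Toy process model: exponential decay of a carbon pool."""
    return 100.0 * np.exp(-k * t)

# Invented observations with an assumed measurement uncertainty (sigma)
t_obs = np.array([1.0, 2.0, 4.0, 8.0])
y_obs = np.array([81.0, 66.0, 45.0, 20.0])
sigma = 5.0

def log_prior(k):                # prior parameter pdf: k ~ N(0.25, 0.1)
    return -0.5 * ((k - 0.25) / 0.1) ** 2

def log_likelihood(k):           # Gaussian data pdf around the model output
    resid = y_obs - model(k, t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

def log_posterior(k):            # Bayes' Theorem (up to a normalising constant)
    return log_prior(k) + log_likelihood(k)

# Metropolis sampling of the posterior parameter pdf
k_current, samples = 0.25, []
for _ in range(20_000):
    k_prop = k_current + rng.normal(0.0, 0.02)
    if np.log(rng.random()) < log_posterior(k_prop) - log_posterior(k_current):
        k_current = k_prop
    samples.append(k_current)

posterior = np.array(samples[5_000:])          # discard burn-in
print(f"posterior mean of k: {posterior.mean():.3f} (sd {posterior.std():.3f})")
# All further model runs draw k from `posterior`, so the reduced parameter
# uncertainty propagates into the uncertainty of the model results.
```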

  15. Summary • Uncertainty is already a key feature of CEU measurement and modelling • However, implementation of the NEU protocols is a useful addition to the methods already in place in CarboEurope • The Bayesian approach can not only quantify but also reduce uncertainty in model parameters, even with limited information available • Why it is good to be uncertain? • Poor communication of uncertainties leads to misinterpretation, misunderstanding and ultimately to wrong decisions (e.g. Harwood and Stokes 2003) • Only "rigidly defined areas of doubt and uncertainty" will prevent this

  16. Thank you for your attention “There is a theory which states that if ever for any reason anyone discovers what exactly the Universe is for and why it is here it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another that states that this has already happened.” Douglas Adams
