
Uncertainty analysis – combining quantitative & qualitative assessments



Presentation Transcript


  1. Uncertainty analysis – combining quantitative & qualitative assessments Andy Hart, Food and Environment Research Agency, UK andy.hart@fera.gsi.gov.uk JIFSAN, June 2011

  2. Origins and acknowledgements
• Earlier Fera projects on probabilistic modelling
• EFSA & IPCS/WHO exposure uncertainty work groups
• Review of EU scientific committee opinions
• UK research project for Food Standards Agency: JP Gosling, Peter Craig, Alan Boobis, David Coggon, David Jones
• ‘Transatlantic dialogue project’:
  • EU: Jim Bridges, Peter Calow, Jan Linders, Wolfgang Dekant, Theo Vermeire, David Gee, Tony Hardy, Vladimir Garkov, Takis Daskaleros
  • US: Vicki Dellarco, Linda Abbott, Nancy Beck, Marcus Aguilar
  • Canada: Hans Yu, Titus Tao, Scott Jordan, Myriam Hill, Peter Chan, Ed Black, Laurent Gemar, Taylor LaCombe, John Giraldez
• ILSI Expert Group on data selection for BMD modelling

  3. Outline
• Requirements for addressing uncertainty
• Quantitative and qualitative approaches
• Uncertainty tables for unquantified uncertainties
• Combining qualitative and quantitative assessments

  4. Characterising uncertainty ‘Expression of uncertainty or variability in risk estimates may be qualitative or quantitative, but should be quantified to the extent that is scientifically achievable.’ Codex Working Principles for Risk Analysis

  5. Practical considerations:
• It is never practical to quantify all uncertainty
• Quantifying one uncertainty often introduces others
• Some uncertainties may be too ‘deep’ to quantify
• Time and resources are often limiting (incl. crises)
• There are cultural barriers to quantification
• Simple analysis is often sufficient

  6. So we need…
• A flexible strategy for uncertainty analysis that can adapt to a range of contexts and requirements
• Methods for deciding which uncertainties to quantify
• Appropriate quantitative methods
• Methods to account for unquantified uncertainties
• Ways to characterise overall uncertainty – both quantified and unquantified

  7. Existing approaches
• Apart from default uncertainty factors, uncertainty receives no explicit attention in most assessments
• Usually it is indicated only by the use of ambiguous qualifiers in risk statements: “appear… approximately… believe… cannot be excluded… estimated… expected… in general… indicate… may… perhaps… possible… potential… probably… reasonable… suggest… suspected… theoretically… uncertain… variable…”
(Review of 100 EU scientific committee opinions, 2007)

  8. Existing approaches
• Wide range of quantitative approaches
  • Deterministic, interval, probabilistic, etc.
• Qualitative/semi-quantitative approaches include:
  • Uncertainty tables (e.g. EFSA, REACH, US NRC, Health Canada, Walker et al.)
  • Weight of evidence procedures (e.g. IARC)
  • Evidence maps (Wiedemann et al.)
  • Subjective probabilities (IPCC, Morgan et al., Neutra)
  • Pedigree analysis (van der Sluijs et al., IPCS/WHO)
  • Social/participatory appraisal (Stirling, Wynne etc.)
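To make the contrast among the quantitative approaches concrete, here is a minimal sketch (not from the slides; all numbers, units and distributions are hypothetical) of deterministic, interval and probabilistic (Monte Carlo) treatments of the same toy exposure calculation, intake = concentration × consumption / body weight:

```python
import math
import random

# Toy exposure model: intake = concentration * consumption / body weight.
# Every value below is hypothetical, chosen only to illustrate the three styles.

# Deterministic: single point estimates give a single answer.
point = 0.5 * 100 / 60               # units illustrative

# Interval: propagate lower and upper bounds to bound the result.
lower = 0.2 * 50 / 80
upper = 0.9 * 200 / 50

# Probabilistic: sample each input from a distribution and look at percentiles.
random.seed(1)
samples = []
for _ in range(10_000):
    conc = random.lognormvariate(math.log(0.5), 0.5)
    cons = random.uniform(50, 200)
    bw = random.lognormvariate(math.log(60), 0.15)   # lognormal: always positive
    samples.append(conc * cons / bw)
samples.sort()
p95 = samples[int(0.95 * len(samples))]              # upper tail of exposure
```

The probabilistic version yields a whole distribution, so a risk statement can quote an upper percentile rather than a single number or a (often very wide) interval.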

  9. Minimum requirements for addressing uncertainty
• Systematically identify and document sources of uncertainty affecting the assessment
• Evaluate their combined effect on the outcome
• Identify and characterise any deep uncertainties
• Communicate with risk managers & stakeholders

  10. A flexible strategy
[Diagram: a simple starting point using uncertainty tables, adding quantitative methods if needed, and methods for ‘deep’ uncertainties if needed]

  11. Two types of assessment question → two types of uncertainty table
Quantitative questions (e.g. benchmark dose)
• Calculation, measure or estimate; quantitative scale
• Express uncertainty in terms of how different the true value could be
Categorical questions (e.g. relevance of effect to humans)
• Weight of evidence; yes/no scale
• Express uncertainty in terms of the probability of alternative outcomes
[Diagrams: a quantitative scale in mg/kg/day, and a 0–1 probability scale running from ‘No’ to ‘Yes’]

  12. Uncertainty table for quantitative questions
[Example table and impact scale shown on slide]
* Draft case study from ILSI-Europe WG on selection of data for benchmark dose modelling

  13. Uncertainty table for quantitative questions: Specify in precise terms the quantity being assessed

  14. Uncertainty table for quantitative questions: List uncertainties affecting the assessment

  15. Uncertainty table for quantitative questions: Define a scale for estimating the impact of uncertainties

  16. Uncertainty table for quantitative questions: Estimate the impact of each uncertainty on how different the ‘true’ value might be (‘?’ = cannot evaluate)

  17. Uncertainty table for quantitative questions: Estimate the combined impact of all the uncertainties (judgment, not calculation)

  18. Uncertainty table for quantitative questions: Add a narrative description of overall uncertainty for use in the assessment summary

  19. Uncertainty table for quantitative questions: …and describe any uncertainties that cannot be evaluated

  20. Uncertainty table for quantitative questions
Options when more refined assessment is needed:
• Quantify the most important uncertainties, deterministically or probabilistically (retain the tabular approach for unquantified uncertainties)
• Use formal expert elicitation to estimate the individual and/or combined impacts
• Use a quantitative model to combine the impacts
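As an illustration of the third option, one common sketch (assumed here, not taken from the slides; all factor bounds are hypothetical) treats each tabulated uncertainty as a multiplicative lognormal factor on the estimate and combines them by Monte Carlo:

```python
import math
import random

# Hypothetical sketch: each row of the uncertainty table is expressed as a
# multiplicative factor on the estimate (e.g. "true value could be 1/2x to 2x").
# Treating those bounds as ~95% limits of a lognormal factor, the combined
# impact is the distribution of the product of all the factors.

random.seed(42)

estimate = 100.0                                    # hypothetical BMDL, mg/kg bw/day
# (lower, upper) multiplicative bounds judged for each uncertainty (illustrative)
factor_bounds = [(0.5, 2.0), (0.8, 1.25), (0.33, 3.0)]

def sample_factor(lo, hi):
    """Sample a lognormal factor whose ~95% interval is (lo, hi)."""
    mu = (math.log(lo) + math.log(hi)) / 2
    sigma = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    return random.lognormvariate(mu, sigma)

draws = sorted(
    estimate * math.prod(sample_factor(lo, hi) for lo, hi in factor_bounds)
    for _ in range(20_000)
)
lower95 = draws[int(0.025 * len(draws))]            # combined 95% interval
upper95 = draws[int(0.975 * len(draws))]
```

The resulting interval replaces the judged "combined impact" row of the table for the quantified uncertainties; any rows that resist quantification stay in the table as before.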

  21. Two types of assessment question → two types of uncertainty table
Quantitative questions (e.g. benchmark dose)
• Calculation, measure or estimate; quantitative scale
• Express uncertainty in terms of how different the true value could be
Categorical questions (e.g. relevance of effect to humans)
• Weight of evidence; yes/no scale
• Express uncertainty in terms of the probability of alternative outcomes
[Diagrams: a quantitative scale in mg/kg/day, and a 0–1 probability scale running from ‘No’ to ‘Yes’]

  22. Uncertainty table for categorical questions
Key: up arrows tend towards ‘yes’, down arrows towards ‘no’
↑↑↑ could be sufficient alone; ↑↑ contributes importantly; ↑ minor contribution; ● no influence; ? unable to evaluate (and similarly for ↓, ↓↓, ↓↓↓)
[Example table and a 0–100% probability scale from ‘No’ to ‘Yes’ shown on slide]
* Draft case study from ILSI-Europe WG on selection of data for benchmark dose modelling

  23. Uncertainty table for categorical questions: Specify in precise terms the question that is being addressed

  24. Uncertainty table for categorical questions: List & summarise the lines of evidence relevant to the question

  25. Uncertainty table for categorical questions: List the strengths & uncertainties affecting each line of evidence

  26. Uncertainty table for categorical questions: Define a scale for expressing the influence of evidence on the conclusion

  27. Uncertainty table for categorical questions: Evaluate the influence of each line of evidence, taking account of its strengths & uncertainties

  28. Uncertainty table for categorical questions: Considering all lines of evidence, evaluate the probability that the answer to the question is positive, and express it numerically and/or verbally

  29. Uncertainty table for categorical questions: …and describe any uncertainties that cannot be evaluated

  30. Uncertainty table for categorical questions
Options when more refined assessment is needed:
• Break down the lines of evidence in more detail
• Elicit probability judgments using formal methods
• Consider quantitative modelling (e.g. belief nets)
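One way to sketch the modelling option (an assumption of mine, not something the slides specify) is to translate the arrow ratings into likelihood ratios and combine them with a prior via Bayes' rule, treating the lines of evidence as independent. The numeric ratio values below are purely illustrative:

```python
# Hypothetical mapping from arrow ratings to likelihood ratios: how many times
# more likely each line of evidence is under 'yes' than under 'no'. The exact
# values are illustrative, not from the slides.
likelihood_ratio = {
    "+++": 20.0, "++": 5.0, "+": 2.0,     # evidence tending to 'yes'
    "0": 1.0,                              # no influence
    "-": 0.5, "--": 0.2, "---": 0.05,      # evidence tending to 'no'
}

def posterior_yes(ratings, prior=0.5):
    """Combine rated lines of evidence into P(answer is 'yes') via Bayes' rule,
    assuming the lines of evidence are independent given the answer."""
    odds = prior / (1 - prior)
    for r in ratings:
        odds *= likelihood_ratio[r]
    return odds / (1 + odds)

# e.g. two strongly supporting lines of evidence and one weakly opposing line
p = posterior_yes(["++", "++", "-"])
```

This is essentially a two-node belief net; the independence assumption is exactly what fuller belief-net modelling would relax.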

  31. Combining quantitative & qualitative assessment (1)
Form an overall narrative characterisation, e.g. ‘The relevant Point of Departure for PhIP is 2-3x below the BMDL10 for prostate cancer’

  32. Combining quantitative & qualitative assessment (2)
More sophisticated options:
• Elicit a distribution for overall uncertainty, combining the quantified and unquantified uncertainties
- or -
• Replace the qualitative judgments with distributions and combine them mathematically with the quantified uncertainties
…taking account of dependencies
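A minimal sketch of the second option (hypothetical numbers throughout): the qualitative judgment is replaced by an elicited lognormal factor and combined with the quantified uncertainty by Monte Carlo, with a correlated normal term standing in for the dependence between them:

```python
import math
import random

# Hypothetical sketch: combine a quantified uncertainty distribution with an
# elicited distribution for a previously unquantified factor. A correlated
# normal term represents dependence between them; rho is an assumed value.

random.seed(7)
rho = 0.6                                        # assumed correlation

draws = []
for _ in range(20_000):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho**2) * random.gauss(0, 1)  # corr(z1, z2) = rho
    quantified = 100.0 * math.exp(0.3 * z1)      # quantified estimate, median 100
    elicited = math.exp(0.5 * z2)                # elicited extra factor, median 1
    draws.append(quantified * elicited)

draws.sort()
median = draws[len(draws) // 2]
p95 = draws[int(0.95 * len(draws))]              # upper bound reflecting both sources
```

Positive dependence widens the combined distribution relative to treating the two sources as independent, which is why the slide's caveat about dependencies matters.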

  33. A flexible strategy
Uncertainty tables:
• Provide a ‘first tier’ uncertainty analysis
• Help decide which uncertainties to quantify
• Characterise unquantified uncertainties
• Help identify ‘deep’ uncertainties
• Contribute to the overall characterisation of uncertainty
[Diagram: uncertainty tables, quantitative methods and methods for ‘deep’ uncertainties feed into an overall characterisation of uncertainty]
