

Probabilistic Risk Assessment in Environmental Toxicology. RISK: Perception, Policy & Practice Workshop, October 3-4, 2007, SAMSI, Research Triangle Park, NC. John W. Green, Ph.D., Ph.D. Senior Consultant: Biostatistics, DuPont Applied Statistics Group.


Presentation Transcript


  1. Probabilistic Risk Assessment in Environmental Toxicology. RISK: Perception, Policy & Practice Workshop, October 3-4, 2007, SAMSI, Research Triangle Park, NC. John W. Green, Ph.D., Ph.D. Senior Consultant: Biostatistics, DuPont Applied Statistics Group

  2. Topics Addressed in Environmental Risk Assessment • Present & proposed regulatory methods • Concerns • Micro- vs macro-assessments • Variability vs Uncertainty • Exposure and Toxicity • Exposure models (Monte Carlo, PBA) • extensive literature on exposure • Toxicity • Species Sensitivity Distributions (Monte Carlo) • Combining the two for risk assessment

  3. [Diagram: deterministic vs. probabilistic treatment of toxicity and exposure, combined in the toxicity/exposure ratio (TER)]

  4. Assessment of Toxicity • Species level assessments • Laboratory toxicity experiments • Greenhouse studies • Field studies • Ecosystem level assessment • Most sensitive species • Mesocosm studies • Species Sensitivity Distribution

  5. Species Level Assessment: NOEC (aka NOAEL) and ECx • LOEC = lowest tested conc at which a statistically significant adverse effect is observed • NOEC = highest tested conc < LOEC • LOEC and NOEC depend on experimental design & statistical test • ECx = conc producing x% effect • ECx depends on experimental design, choice of model, and choice of x
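The ECx definition above can be illustrated with a small curve fit. This is a sketch only: the log-logistic model, the concentrations, and the response values below are invented for illustration and are not the slides' actual example.

```python
# Hypothetical ECx estimation: fit a two-parameter log-logistic
# concentration-response model, then invert it for any x.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    # Fraction of the control response remaining at each concentration
    return 1.0 / (1.0 + (conc / ec50) ** slope)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])      # tested concentrations (invented)
resp = np.array([0.98, 0.95, 0.80, 0.45, 0.15, 0.05])  # relative responses (invented)

(ec50, slope), _ = curve_fit(log_logistic, conc, resp, p0=[2.0, 1.0])

def ecx(x, ec50, slope):
    # Concentration producing an x% effect under the fitted model
    p = x / 100.0
    return ec50 * (p / (1.0 - p)) ** (1.0 / slope)

print(f"EC50 = {ec50:.2f}, EC10 = {ecx(10, ec50, slope):.2f}")
```

Note how the ECx depends on the fitted model, as the slide says: a different model family would invert to different ECx values from the same data.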

  6. Ecosystem level assessment: Current Method • Determine the NOEC (or EC50) for each species representing an ecosystem • Find the smallest NOEC (or EC50) • Divide it by 10, 100, or 1000 (uncertainty factor) • Regulate from this value • or argue against it
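The deterministic procedure above amounts to a one-line calculation. The species names, NOEC values, and uncertainty factor below are invented for illustration:

```python
# Deterministic method: smallest NOEC across species, divided by an
# uncertainty factor. All values here are hypothetical.
noecs = {"daphnia": 1.2, "fathead minnow": 4.5, "algae": 0.8, "midge": 2.0}  # mg/L

uncertainty_factor = 100  # 10, 100, or 1000 depending on data quality
min_species, min_noec = min(noecs.items(), key=lambda kv: kv[1])
regulatory_value = min_noec / uncertainty_factor

print(f"most sensitive species: {min_species}")
print(f"regulatory value: {regulatory_value} mg/L")  # 0.8 / 100 = 0.008 mg/L
```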

  7. Ecosystem level assessment: Probabilistic Approach • Collect a consistent measure of toxicity from a representative set of species • EC50s or NOECs (not both) • Fit a distribution (SSD) to these numerical measures • Estimate the concentration, HC5, that protects 95% of species in the ecosystem • Advantages and problems with SSDs
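A minimal sketch of the probabilistic approach, assuming a log-normal SSD (the choice discussed on later slides) and eight invented species EC50s:

```python
# Fit a log-normal SSD to species EC50s and estimate the HC5, the
# concentration hazardous to only 5% of species. Data are invented.
import numpy as np
from scipy import stats

ec50s = np.array([0.5, 1.2, 2.0, 3.3, 5.1, 8.0, 14.0, 22.0])  # mg/L, one per species
log_ec50s = np.log10(ec50s)

mu, sigma = log_ec50s.mean(), log_ec50s.std(ddof=1)

# HC5 = 5th percentile of the fitted log-normal distribution
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)
print(f"HC5 estimate: {hc5:.3f} mg/L")
```

This plug-in estimate ignores parameter uncertainty; slide 15's two-stage sampling is one way to put that uncertainty back in.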

  8. Selection of Toxicity Data: SSD by Habitat. Visual groupings are not taxonomic classes but are defined by habitat, possibly related to mode of action

  9. How Many Species? • Newman’s method: 40 to 60 species • Snowball’s chance… • Might reduce this by good choice of groups to model • Aldenberg-Jaworski: 1 species will do • If you make enough assumptions,… • 8 is usual target • 5 is common • 20-25 in some non-target plant studies

  10. Which Distribution to Fit? • Normal, log-normal, log-logistic, Burr III…? • With 5-8 data points, selecting the “right” distribution is a challenge • Next slide gives simulation results • Does it matter? • Recent simulation study suggests yes • 2nd slide following: uniform [0,1] generated • Various distributions fit • Actual laboratory data suggests yes

  11. Power to Detect Non-Lognormality: Exponential Distribution Generated
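A hedged sketch of the kind of power simulation this slide summarizes: generate small non-log-normal (exponential) samples and count how often a normality test on the log values rejects. The slide's actual test, sample size, and settings are not given; Shapiro-Wilk with n = 8 is an assumption here.

```python
# Power to detect non-lognormality at SSD-sized samples (hypothetical setup)
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_species, n_sims, alpha = 8, 2000, 0.05

rejections = 0
for _ in range(n_sims):
    sample = rng.exponential(scale=1.0, size=n_species)  # truth is not log-normal
    _, p = stats.shapiro(np.log(sample))                 # test normality of log values
    rejections += p < alpha

power = rejections / n_sims
print(f"estimated power to reject log-normality with n={n_species}: {power:.2f}")
```

With only 5-8 species, rejection rates stay low, which is the slide's point: the data rarely tell you the fitted distribution is wrong.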

  12. Does it Matter? Q05 Simulations: True Value = 0.05, Uniform [0,1] Generated
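The simulation this slide reports can be sketched as follows. The sample size and the particular pair of candidate distributions (normal vs. log-normal) are assumptions; the slide's study fit several distributions:

```python
# Does the fitted distribution matter? Draw small Uniform[0,1] samples
# (true 5th percentile = 0.05) and compare the Q05 implied by two fits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_sims = 8, 2000
q05_normal, q05_lognormal = [], []

for _ in range(n_sims):
    x = rng.uniform(0.0, 1.0, size=n)
    # Normal fit: Q05 from the sample mean and sd
    q05_normal.append(stats.norm.ppf(0.05, loc=x.mean(), scale=x.std(ddof=1)))
    # Log-normal fit: Q05 from the mean and sd of log values
    lx = np.log(x)
    q05_lognormal.append(np.exp(stats.norm.ppf(0.05, loc=lx.mean(), scale=lx.std(ddof=1))))

print("true Q05 = 0.05")
print(f"normal fit:     mean Q05 = {np.mean(q05_normal):.3f}")
print(f"log-normal fit: mean Q05 = {np.mean(q05_lognormal):.3f}")
```

The two fits give systematically different lower-tail estimates from the same data, so the distributional choice does matter for HC5-type quantities.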

  13. Which Laboratory Species? One EUFRAM case study fits an SSD to the following. Aquatic toxicologists can comment (and have) on whether these values belong to a meaningful population

  14. Variability and Uncertainty • Uncertainty reflects lack of knowledge of the system under study • Ex1: what distribution to fit for the SSD • Ex2: what mathematical model to use to estimate ECx • Increased knowledge will reduce uncertainty • Variability reflects lack of control: inherent variation or noise among individuals • Increased knowledge of the animal or plant species will not reduce variability

  15. Variability & Uncertainty • The fitted distribution is assumed log-normal • Defined by the population mean and variance • Motivated in part by the standard relationship shown below • Randomly sample from the χ2(n-1) distribution • Then randomly sample from a normal with the above variance, and mean equal to the sample mean • Note: If the formulas below are used, only variability is captured
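The two-stage sampling described above can be sketched using the standard result (n-1)s^2/sigma^2 ~ chi-square(n-1): first draw a plausible true variance (uncertainty), then draw species values from a normal with that variance (variability). The sample statistics below are invented:

```python
# Two-stage sampling that captures uncertainty about sigma^2 as well as
# species-to-species variability. Sample statistics are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 8                 # number of species behind the SSD
xbar, s2 = 0.6, 0.3   # sample mean and variance of log10(EC50)

n_draws = 5000
chi2 = rng.chisquare(n - 1, size=n_draws)
sigma2 = (n - 1) * s2 / chi2                          # draws of the true variance
draws = rng.normal(loc=xbar, scale=np.sqrt(sigma2))   # variability + uncertainty

# Using s2 directly, as in the slide's note, captures variability only
naive = rng.normal(loc=xbar, scale=np.sqrt(s2), size=n_draws)
print(f"sd with uncertainty: {draws.std():.3f}  vs variability only: {naive.std():.3f}")
```

The two-stage draws are more dispersed than the naive ones, which is exactly the uncertainty the slide warns is lost when the plug-in formulas are used.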

  16. Spaghetti plot Probabilities (vertical-axis values) associated with a given value of log(EC50) are themselves distributed. For a given log(EC50) value, the middle 95% of these secondary probabilities defines a 95% confidence interval for the proportion of species affected at that conc

  17. For a given proportion (value of y), the values of Log(EC50) (horizontal variable) that might have produced the given y-value are distributed. For a given y value, the middle 95% of these x-values defines 95% confidence bounds on the distribution of log(ECy) values.
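One way to produce the spaghetti curves and the vertical 95% bands just described is a parametric bootstrap over refitted SSDs. This is a sketch under that assumption, with invented data; the slides do not state how their curves were generated:

```python
# Parametric bootstrap "spaghetti": refit the log-normal SSD to many
# resampled datasets, then take the middle 95% of curves at each x.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
log_ec50s = np.log10([0.5, 1.2, 2.0, 3.3, 5.1, 8.0, 14.0, 22.0])  # invented
mu, sigma = log_ec50s.mean(), log_ec50s.std(ddof=1)

grid = np.linspace(mu - 3 * sigma, mu + 3 * sigma, 50)  # log(EC50) axis
curves = []
for _ in range(1000):
    boot = rng.normal(mu, sigma, size=log_ec50s.size)   # one resampled dataset
    curves.append(stats.norm.cdf(grid, boot.mean(), boot.std(ddof=1)))

lower, upper = np.percentile(curves, [2.5, 97.5], axis=0)  # vertical 95% band
print(f"band width near the median: {upper[25] - lower[25]:.2f}")
```

Slicing the same bundle of curves horizontally, at a fixed proportion, gives the confidence bounds on log(ECy) described above.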

  18. Summary Plot for SSD

  19. Putting it All Together Joint Probability Curves Plot exposure and toxicity distributions together to understand the likelihood of the exposure concentration exceeding the toxic threshold of a given percent of the population

  20. Calculating Risk The risk is given by Pr[Xe > Xs], where Xe = exposure and Xs = sensitivity or toxicity. This is an "average" probability that exposure will exceed the sensitivity of species exposed. Not clear that this captures the right risk. Work needed here
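A Monte Carlo sketch of Pr[Xe > Xs], assuming independent log-normal exposure and sensitivity distributions with invented parameters, checked against the closed form for a difference of normals on the log scale:

```python
# "Average" risk Pr[Xe > Xs] under independence. All parameters invented.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 100_000

# log10 exposure ~ N(-1.0, 0.4); log10 sensitivity ~ N(0.6, 0.55)
log_exposure = rng.normal(-1.0, 0.4, size=n)
log_sensitivity = rng.normal(0.6, 0.55, size=n)

risk = np.mean(log_exposure > log_sensitivity)

# Closed form: Pr[Xe > Xs] = Phi((mu_e - mu_s) / sqrt(sd_e^2 + sd_s^2))
analytic = norm.cdf((-1.0 - 0.6) / np.hypot(0.4, 0.55))
print(f"Monte Carlo Pr[Xe > Xs] = {risk:.4f}, analytic = {analytic:.4f}")
```

As the slide cautions, this single number averages over both distributions and may not be the risk a regulator actually wants; it says nothing about which species or which exposure scenarios drive the exceedances.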

  21. Conclusions • PRA can bring increased reality to risk management by • communicating uncertainty more realistically • separating uncertainty from variability • clarifying risk of environmental effects • PRA is only as good as the assumptions and theories on which it rests • The bad news is that implementation is running ahead of understanding

  22. Conclusions • SSDs based on tiny datasets are unreliable • Identifying what populations are appropriate subjects for SSDs is vital • 2-D Monte Carlo methods often assume independent inputs or specific correlations • Not realistic in many cases • PBA can accommodate dependent inputs • But can lead to wide bounds • Have other limitations restricting use • MCMC can accommodate correlated inputs • But is mathematically demanding
