
Methods and Applications of Uncertainty and Sensitivity Analysis

H. Christopher Frey, Ph.D.

Professor

Department of Civil, Construction, and Environmental Engineering

North Carolina State University

Raleigh, NC 27695

Prepared for:

Workshop on Climate Change

Washington, DC

March 7, 2005

Outline
  • Why are uncertainty and sensitivity analysis needed?
  • Overview of methods for uncertainty analysis
    • Model inputs
      • Empirical data
      • Expert judgment
    • Model uncertainty
    • Scenario uncertainty
  • Overview of methods for sensitivity analysis
  • Examples
    • Technology assessment
    • Emissions Factors and Inventories
    • Air Quality Modeling
    • Risk Assessment
  • Findings
  • Recommendations
Why are uncertainty and sensitivity analysis needed?
  • Strategies for answering this question:
    • what happens when we ignore uncertainty and sensitivity?
    • what do decision makers want to know that motivates doing uncertainty and sensitivity analysis?
    • what constitutes best scientific practice?
  • Program and research managers may not care about all three, but might find at least one to be convincing (and useful)
When is Probabilistic Analysis Needed or Useful?
  • Consequences of poor or biased estimates are unacceptably high
  • A (usually conservative) screening level analysis indicates a potential concern, but carries a level of uncertainty
  • Determining the value of collecting additional information
  • Uncertainty stems from multiple sources
  • Significant equity issues are associated with variability
  • Ranking or prioritizing significance of multiple pathways, pollutants, sites, etc.
  • Cost of remediation or intervention is high
  • Scientific credibility is important
  • Obligation to indicate what is known and how well it is known
When is a Probabilistic Approach Not Needed?
  • When a (usually conservative) screening level analysis indicates a negligible problem
  • When the cost of intervention is smaller than the cost of analysis
  • When safety is an urgent and/or obvious issue
  • When there is little variability or uncertainty
Myths: Barriers to Use of Methods
  • Myth: it takes more resources to do uncertainty analysis, we have deadlines, we don’t know what to do with it, let’s just go with what we have…
  • Hypothesis 1: poorly informed decisions based upon misleading deterministic/point estimates can be very costly, leading to a longer-term and larger resource allocation to correct mistakes that could have been avoided or to find better solutions
  • Hypothesis 2: Uncertainty analysis helps to determine when a robust decision can be made versus when more information is needed first
  • Hypothesis 3: Uncertainty and sensitivity analysis help identify key weaknesses and focus limited resources to help improve estimates
  • Hypothesis 4: Doing uncertainty analysis actually reduces overall resource requirements, especially if it is integrated into the process of model development and applications
Role of Modeling in Decision-Making
  • Modeling should provide insight
  • Modeling should help inform a decision
  • Modeling should be in response to clearly defined objectives that are relevant to a decision.
Questions that Decision-Makers and Stakeholders Typically Ask
  • How well do we know these numbers?
    • What is the precision of the estimates?
    • Is there a systematic error (bias) in the estimates?
    • Are the estimates based upon measurements, modeling, or expert judgment?
  • How significant are differences between two alternatives?
  • How significant are apparent trends over time?
  • How effective are proposed control or management strategies?
  • What is the key source of uncertainty in these numbers?
  • How can uncertainty be reduced?
Application of Uncertainty to Decision Making
  • Risk preference
    • Risk averse
    • Risk neutral
    • Risk seeking
  • Utility theory
  • Benefits of quantifying uncertainty: Expected Value of Including Uncertainty
  • Benefits of reducing uncertainty: Expected Value of Perfect Information (and others)
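To make these decision-analytic quantities concrete, here is a minimal sketch of the expected value of perfect information (EVPI) for a hypothetical two-option decision; the payoff functions and the lognormal input uncertainty are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical net benefits (e.g., $M) of two control strategies under
# uncertainty, represented by Monte Carlo samples of an uncertain input.
n = 100_000
u = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # uncertain damage factor

benefit_a = 10.0 - 4.0 * u   # option A: cheap, sensitive to the uncertainty
benefit_b = 7.0 - 1.5 * u    # option B: costly, robust to the uncertainty

# Decision under uncertainty: pick the option with the higher expected value.
ev_a, ev_b = benefit_a.mean(), benefit_b.mean()
best_under_uncertainty = max(ev_a, ev_b)

# With perfect information we could observe u first and pick the better
# option in every realization of the uncertainty.
ev_with_perfect_info = np.maximum(benefit_a, benefit_b).mean()

evpi = ev_with_perfect_info - best_under_uncertainty
print(f"E[A] = {ev_a:.2f}, E[B] = {ev_b:.2f}")
print(f"Expected value of perfect information = {evpi:.2f}")
```

The EVPI bounds what it is worth paying to resolve the uncertainty before deciding; a zero EVPI indicates the decision is already robust.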
Variability and Uncertainty
  • Variability: refers to the certainty that
    • different members of a population will have different values (inter-individual variability)
    • values will vary over time for a given member of the population (intra-individual variability)
  • Uncertainty: refers to lack of knowledge regarding
    • True value of a fixed but unknown quantity
    • True population distribution for variability
  • Both depend on averaging time
Variability and Uncertainty
  • Sources of Variability
    • Stochasticity
    • Periodicity, seasonality
    • Mixtures of subpopulations
    • Variation that could be explained with better models
    • Variation that could be reduced through control measures
Variability and Uncertainty
  • Sources of Uncertainty:
    • Random sampling error for a random sample of data
    • Measurement errors
      • Systematic error (bias, lack of accuracy)
      • Random error (imprecision)
    • Non-representativeness
      • Not a random sample, leading to bias in the mean (e.g., measurements only at loads not typical of daily operations)
      • Direct monitoring versus infrequent sampling versus estimation, averaging time
      • Omissions
    • Surrogate data (analogies with similar sources)
    • Lack of relevant data
    • Problem and scenario specification
    • Modeling
Overview of “State of the Science”
  • Statistical Methods Based Upon Empirical Data
  • Statistical Methods Based Upon Judgment
  • Other Quantitative Methods
  • Qualitative Methods
  • Sensitivity Analysis
  • Scenario Uncertainty
  • Model Uncertainty
  • Communication
  • Decision Analysis
Statistical Methods Based Upon Empirical Data
  • Frequentist, classical
  • Statistical inference from sample data
    • Parametric approaches
      • Parameter estimation
      • Goodness-of-fit (see the sketch after this list)
    • Nonparametric approaches
    • Mixture distributions
    • Censored data
    • Dependencies, correlations, deconvolution
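As a minimal illustration of parameter estimation and goodness-of-fit evaluation, the sketch below fits lognormal and gamma distributions to a synthetic sample using scipy; the data are invented, and the Kolmogorov-Smirnov p-values are only approximate because the parameters are estimated from the same data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=1.0, sigma=0.6, size=50)  # synthetic sample

# Fit candidate parametric distributions by maximum likelihood.
lognorm_params = stats.lognorm.fit(data, floc=0)
gamma_params = stats.gamma.fit(data, floc=0)

# Goodness of fit: Kolmogorov-Smirnov test against each fitted model.
# (p-values are approximate when parameters are estimated from the data.)
ks_lognorm = stats.kstest(data, 'lognorm', args=lognorm_params)
ks_gamma = stats.kstest(data, 'gamma', args=gamma_params)
print(f"lognormal: KS stat = {ks_lognorm.statistic:.3f}, p = {ks_lognorm.pvalue:.2f}")
print(f"gamma:     KS stat = {ks_gamma.statistic:.3f}, p = {ks_gamma.pvalue:.2f}")
```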
Statistical Methods Based Upon Empirical Data
  • Variability and Uncertainty
    • Sampling distributions for parameters
    • Analytical solutions
    • Bootstrap simulation
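A minimal sketch of nonparametric bootstrap simulation for quantifying sampling-error uncertainty in the mean of a small (synthetic) sample:

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.lognormal(mean=1.0, sigma=0.8, size=25)  # small synthetic sample

# Nonparametric bootstrap: resample the data with replacement many times
# and recompute the statistic of interest (here, the mean).
B = 5000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean() for _ in range(B)
])

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean = {data.mean():.2f}")
print(f"95% bootstrap confidence interval for the mean: ({lo:.2f}, {hi:.2f})")
```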
Propagating Variability and Uncertainty
  • Analytical techniques
    • Exact solutions (limited applicability)
    • Approximate solutions
  • Numerical methods
    • Monte Carlo
    • Latin Hypercube Sampling
    • Other sampling methods (e.g., Hammersley, Importance, stochastic response surface method, Fourier Amplitude Sensitivity Test, Sobol’s method, Quasi-Monte Carlo methods, etc.)
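The sketch below illustrates numerical propagation through a toy model, contrasting simple random (Monte Carlo) sampling with Latin hypercube sampling via scipy's qmc module; the model and input distributions are hypothetical.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

def model(x1, x2):
    # Toy nonlinear model standing in for an emissions or cost model.
    return x1 * np.exp(0.5 * x2)

n = 2000
rng = np.random.default_rng(0)

# Simple random (Monte Carlo) sampling of two independent inputs.
x1_mc = stats.lognorm.rvs(s=0.4, scale=10.0, size=n, random_state=rng)
x2_mc = stats.norm.rvs(loc=0.0, scale=1.0, size=n, random_state=rng)
y_mc = model(x1_mc, x2_mc)

# Latin hypercube sampling: stratify the unit hypercube, then invert the CDFs.
lhs = qmc.LatinHypercube(d=2, seed=0).random(n)
x1_lhs = stats.lognorm.ppf(lhs[:, 0], s=0.4, scale=10.0)
x2_lhs = stats.norm.ppf(lhs[:, 1], loc=0.0, scale=1.0)
y_lhs = model(x1_lhs, x2_lhs)

for label, y in [("Monte Carlo", y_mc), ("Latin hypercube", y_lhs)]:
    print(f"{label}: mean = {y.mean():.2f}, 95th pct = {np.percentile(y, 95):.2f}")
```

Latin hypercube sampling typically stabilizes the output statistics at smaller sample sizes than simple random sampling because each input range is sampled in equal-probability strata.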
Monte Carlo Simulation
  • Probabilistic approaches are widely used
  • Monte Carlo simulation (and similar sampling methods) is widely used.
  • Why?
    • Extremely flexible
      • Inputs
      • Models
    • Relatively straightforward to conceptualize
Tiered Approach to Analysis
  • Purpose of Analyses (examples)
    • Screening to prioritize resources
    • Regulatory decision-making
    • Research planning
  • Types of Analyses
    • Screening level point-estimates
    • Sensitivity Analysis
    • One-Dimensional Probabilistic Analysis
    • Two-Dimensional Probabilistic Analysis
    • Non-probabilistic approaches
Methods Based Upon Expert Judgment
  • Expert Elicitation
    • Heuristics and Biases
      • Availability
      • Anchoring and Adjustment
      • Representativeness
      • Others (e.g., Motivational, Expert, etc.)
    • Elicitation Protocols
      • Motivating the expert
      • Structuring
      • Conditioning
      • Encoding
      • Verification
    • Documentation
    • Individuals and Groups
    • When Experts Disagree
Key Ongoing Challenges
  • Expert Judgment vs. Data
    • Perception that judgment is more biased than analysis of available data
    • Unless data are exactly representative, they too could be biased
    • Statistical methods are “objective” in that the results can be reproduced by others, but this does not guarantee absence of bias
    • A key area for moving forward is to agree on conditions under which expert judgment is an acceptable basis for subjective probability distributions, even for rulemaking situations
Appropriate Use of Expert Judgment in Regulatory Decision Making
  • There are examples…e.g.,
    • analysis of health effects for EPA standards
    • Uncertainty in benefit/cost analysis (EPA, OMB)
    • Probabilistic risk analysis of nuclear facilities
  • Key components of credible use of expert judgment:
    • Follow a clear and appropriate protocol for selecting experts and for elicitation
    • For the conditioning step, consider obtaining input via workshop, but for encoding, work individually with experts – preferably at their location
    • Document (explain) the basis for each judgment
    • Compare judgments: identify key similarities and differences
    • Evaluate the implications of apparent differences with respect to decision objectives – do not “combine” judgments without first doing this
    • Where possible, allow for iteration
Statistical Methods Based Upon Expert Judgment
  • Bayesian methods can incorporate expert judgment
    • Prior distribution
    • Update with data using likelihood function and Bayes’ Theorem
    • Create a posterior distribution (see the sketch after this list)
  • Bayesian methods can also deal with various complex situations:
    • Conditional probabilities (dependencies)
    • Combining information from multiple sources
  • Appears to be very flexible
  • Computationally, can be very complex
  • Complexity is a barrier to more widespread use
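A minimal conjugate example of the prior-likelihood-posterior cycle: a Beta prior (which might encode expert judgment about a failure probability) updated with binomial data. Conjugacy keeps this case simple; the computational complexity noted above arises in non-conjugate, higher-dimensional problems that typically require Markov chain Monte Carlo. The prior parameters and data here are invented.

```python
from scipy import stats

# Prior for a failure probability, e.g., encoded from expert judgment:
# Beta(2, 8) has mean 0.2 and is fairly diffuse.
a_prior, b_prior = 2.0, 8.0

# New data: 3 failures observed in 20 trials.
failures, trials = 3, 20

# For a binomial likelihood the Beta prior is conjugate, so the posterior
# is available in closed form: no numerical integration required.
a_post = a_prior + failures
b_post = b_prior + (trials - failures)
posterior = stats.beta(a_post, b_post)

print(f"posterior mean = {posterior.mean():.3f}")
print(f"95% credible interval = ({posterior.ppf(0.025):.3f}, {posterior.ppf(0.975):.3f})")
```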
Other Quantitative Methods
  • Interval Methods
    • Simple intervals
    • Probability bounds
    • Produce “optimally” narrow bounds – cannot be any narrower and still enclose all possible outcomes, including dependencies among inputs
    • Bounds can be very wide in comparison to confidence intervals
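A minimal sketch of interval propagation with hypothetical numbers: the computed bounds enclose every possible outcome regardless of the dependence between inputs, which is also why they tend to be wide compared to confidence intervals.

```python
# Minimal interval arithmetic for two operations, assuming only that each
# input lies somewhere in its interval (no distribution, no independence).
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

emission_factor = (0.5, 2.0)   # hypothetical, e.g., g per unit activity
activity = (90.0, 110.0)       # hypothetical units of activity

total = mul(emission_factor, activity)
print(f"bounds on total emissions: {total}")  # (45.0, 220.0)
```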
Other Quantitative Methods
  • Fuzzy methods
    • Representation of vagueness, rather than uncertainty
    • Approximate/semi-quantitative
    • Has been applied in many fields
  • Meta-analysis
    • Quantitatively combine, synthesize, and summarize data and results from different sources
    • Requires assessment of homogeneity among studies prior to combining
    • Produces data with larger sample sizes than the constituent inputs
    • Can be applied to summary data
    • If raw data are available, other methods may be preferred
Scenario Uncertainty
  • A need for formal methods
  • Creativity, brainstorming, imagination
  • Key dimensions (e.g., human exposure assessment)
    • Pollutants
    • Transport pathways
    • Exposure routes
    • Susceptible populations
    • Averaging time
    • Geographic extent
    • Time Periods
    • Activity Patterns
  • Which dimensions/combinations matter, which ones don’t?
  • Uncertainty associated with mis-specification of a scenario – systematic error
  • Scenario definition should be considered when developing and applying models
Model Uncertainty
  • Model Boundaries (related to scenario)
  • Simplifications
    • Aggregation
    • Exclusion
  • Resolution
  • Structure
  • Calibration
  • Validation, Partial validation
  • Extrapolation
Model Uncertainty
  • Methods for Dealing with Model Uncertainty
    • Compare alternative models, but do not combine
    • Weight predictions of alternative models (e.g., probability trees)
    • Meta-models that degenerate into alternative models (e.g., Y = a(|x - t|)^n, where n determines whether the response is linear or nonlinear and t determines whether there is a threshold)
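A short sketch of how such a meta-model degenerates into alternative structures as its parameters change (the parameter values here are arbitrary); placing weights or distributions on a, t, and n then represents structural uncertainty within a single parametric family.

```python
import numpy as np

def meta_model(x, a, t, n):
    # Y = a * |x - t|**n: n = 1 gives a linear response, n != 1 a nonlinear
    # one, and t > 0 shifts the response so a change point appears at x = t.
    return a * np.abs(x - t) ** n

x = np.linspace(0.0, 10.0, 5)
print(meta_model(x, a=2.0, t=0.0, n=1.0))  # degenerates to linear: 2x
print(meta_model(x, a=2.0, t=0.0, n=2.0))  # nonlinear: 2x^2
print(meta_model(x, a=2.0, t=5.0, n=1.0))  # change point at x = 5
```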
Weighting vs. Averaging

[Figure: two probability density plots over an output of interest. In the first, Models A and B each receive equal weight, yielding a bimodal density. In the second, the average of both models places probability density over a range of outcomes that neither model supports.]
Sensitivity Analysis
  • Objectives of Sensitivity Analysis (examples):
    • Help identify key sources of variability (to aid management strategy)
      • Critical control points?
      • Critical limits?
    • Help identify key sources of uncertainty (to prioritize additional data collection to reduce uncertainty)
    • What causes worst/best outcomes?
    • Evaluate model behavior to assist verification/validation
    • To assist in process of model development
  • Local vs. Global Sensitivity Analysis
  • Model Dependent vs. Model Independent Sensitivity Analysis
  • Applicability of methods often depends upon characteristics of a model (e.g., nonlinear, thresholds, categorical inputs, etc.)
Examples of Sensitivity Analysis Methods

[Figure: methods include Nominal Range Sensitivity Analysis (NRSA), Differential Sensitivity Analysis, Regression Analysis (RA), Analysis of Variance (ANOVA), Classification and Regression Trees (CART), Scatter Plots, and Conditional Sensitivity Analysis.]
  • Mathematical Methods
    • Assess sensitivity of a model output to the range of variation of an input.
  • Statistical Methods
    • Effect of variance in inputs on the output distribution.
  • Graphical Methods
    • Representation of sensitivity in the form of graphs, charts, or surfaces.
Sensitivity Analysis Methods (Examples)
  • Nominal Range Sensitivity Analysis
  • Differential Sensitivity Analysis
  • Conditional Analysis
  • Correlation coefficients (sample, rank)
  • Linear regression (sample, rank, variety of basis functions possible)
  • Other regression methods
  • Analysis of Variance (ANOVA)
  • Categorical and Regression Trees (CART) (a.k.a. Hierarchical Tree-Based Regression)
  • Sobol’s method
  • Fourier Amplitude Sensitivity Test (FAST)
  • Mutual Information Index
  • Scatter Plots
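As a minimal illustration of the correlation-based methods in this list, the sketch below computes sample (Pearson) and rank (Spearman) correlation coefficients between Monte Carlo inputs and the output of a toy model; the model and distributions are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 5000

# Monte Carlo sample of three inputs and a model output.
x1 = rng.lognormal(0.0, 0.5, n)
x2 = rng.normal(5.0, 1.0, n)
x3 = rng.uniform(0.0, 1.0, n)
y = x1 ** 2 + 0.5 * x2 + 0.1 * x3   # toy model: y most sensitive to x1

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    pearson, _ = stats.pearsonr(x, y)
    spearman, _ = stats.spearmanr(x, y)
    print(f"{name}: sample r = {pearson:.2f}, rank r = {spearman:.2f}")
```

Rank correlation is less model-dependent than sample correlation because it is insensitive to monotonic nonlinearity in the input-output relationship.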
Sensitivity Analysis: Displays/Summaries
  • Scatter plots
  • Line plots/conditional analyses
  • Radar plots
  • Distributions (for uncertainty or variability in sensitivity)
  • Summary statistics
  • Categorical and regression trees
  • Apportionment of variance
Guidance on Sensitivity Analysis

Guidance for Practitioners, with a focus on food safety process risk models (Frey et al., 2004):

  • When to perform sensitivity analysis
  • Information needed depending upon objectives
  • Preparation of existing or new models
  • Defining the case study/scenarios
  • Selection of sensitivity analysis methods
  • Procedures for application of methods
  • Presentation and interpretation of results
Example of Guidance on Selection of Sensitivity Analysis Methods

Source: Frey et al., 2004, www.ce.ncsu.edu/risk/

Communication
  • Case Studies (scenarios)
  • Graphical Methods
    • Influence Diagrams
    • Decision Tree
    • Others
  • Summary statistics/data
  • Evaluation of effectiveness of methods for communication (e.g., Bloom et al., 1993; Ibrekk and Morgan, 1987)
Example Case Studies
  • Technology Assessment
  • Emission Factors and Inventories
  • Air Quality Modeling
  • Risk Assessment
Role of Technology Assessment in Regulatory Processes (Examples)
  • Assessment of ability of technology to achieve desired regulatory or policy goals (emissions control, safety, efficiency, etc.)
  • Evaluation of regulatory alternatives (e.g., based on model cost estimates)
  • Regulatory Impact Analysis – assessment of costs
Methodology for Probabilistic Technology Assessment
  • Process simulation of process technologies in probabilistic frameworks
    • Integrated Environmental Control Model (IECM) and derivatives
    • Probabilistic capability for ASPEN chemical process simulator
  • Quantification of uncertainty in model inputs
    • Statistical analysis
    • Elicitation of expert judgment
  • Monte Carlo simulation
  • Statistical methods for sensitivity analysis
  • Decision tree approach to comparing technologies and evaluating benefits of additional research
Conceptual Diagram of Probabilistic Modeling

[Figure: process flow diagram of a coal gasification combined cycle system, including coal handling; gasification with particulate and ash removal, cyclones, and fines recycle; hot gas desulfurization; a sulfuric acid plant treating tailgas; a gas turbine; an HRSG and steam cycle with steam turbine; and boiler feedwater treatment, producing electricity and exhaust gas.]

[Figure: input uncertainties in performance and cost inputs are propagated through an engineering performance and cost model of a new process technology, yielding output uncertainties in performance, emissions, and cost.]
Example of a Probabilistic Comparison of Technology Options

Uncertainty in the difference in cost between two technologies, taking into account correlations between them

Example: Engineering Study of Coal-Gasification Systems
  • DOE/METC Engineers
  • Briefing Packets:
    • Part 1: Uncertainty Analysis (9 pages)
    • Part 2: Process Area Technical Background
      • Lurgi Gasifier: 12 pp., 16 refs.
      • KRW Gasifier: 19 pp., 25 refs.
      • Desulfurization: 9 pp., 19 refs.
      • Gas Turbine: 23 pp., 36 refs.
    • Part 3: Questionnaire
  • Follow-Up
Examples of the Judgments of One Expert
  • Fines Carryover
  • Carbon Retention
  • Air/Coal Ratio

Do Different Judgments Really Matter?
  • Specific Sources of Disagreement:
    • Sorbent Loading
    • Sorbent Attrition
  • Qualitative Agreement in Several Cases

Technology Assessment: Findings (1)
  • Interactions among uncertain inputs, and nonlinearities in model, contribute to positive skewness in model output uncertainties
    • Uncertainties in inputs are often positively skewed (physical, non-negative quantities)
    • The mean value of a probabilistic estimate is often “worse” (lower performance, higher cost) than the “best guess” deterministic estimate, and the probability of “worse” outcomes is typically greater than 50 percent.
    • A system approach is needed to account for interactions among process areas
    • Deterministic analysis leads to apparent “cost growth” and “performance shortfall” because it does not account for simultaneous interactions among positively skewed inputs
    • Uncertainty analysis requires more thought pertaining to developing input assumptions, but provides more insight into potential sources of “cost growth” and “performance shortfall”
Technology Assessment: Findings (2)
  • A decision model provides a framework for evaluating judgments regarding the outcomes of additional research and prioritizing additional research
    • Able to quantify the probability of “pay-offs” as well as downside risks
    • Able to compare competing options under uncertainty and identify robust choices
    • Trade-offs when comparing technologies can be evaluated probabilistically (e.g., efficiency, emissions, and cost)
  • It is possible to combine approaches for quantifying uncertainty in one assessment, consistent with objectives
Technology Assessment: Findings (3)
  • Thinking about uncertainties leads to better understanding of what matters most in the assessment
    • Often, only a relatively small number of inputs contribute substantially to uncertainty in a model output
    • Reducing uncertainty in only a few key inputs can substantially reduce downside risk and increase the pay-offs of new technology
    • Conversely, for those inputs to which the output is not sensitive, it is not critical to devote resources to refinement
  • When basing inputs on expert judgments, only those disagreements that really matter to the decision need become the focus of further discussion and evaluation
  • Bottom Line: Probabilistic analysis helps improve decisions and avoid unpleasant “surprises”.
Emission Factors and Inventories
  • Significance to Regulatory Processes:
    • Assessment of capability of technology to reduce/prevent emissions
    • Evaluation of regulatory alternatives
    • Regulatory Impact Analysis – including benefit/cost analysis
    • Component of air quality management at various temporal and spatial scales
    • Component of human and ecological exposure and risk assessment
  • Modeling Aspects
    • Some emission factors are estimated using models
    • Emission Inventories are linear models
    • Specialized models for some emission factors and inventories (e.g., Mobile6, NONROAD, MOVES)
Motivations for Probabilistic Emission Factors and Inventories
  • How good are the estimates?
  • What are the key sources of uncertainty in the estimates that should be targeted for improvement?
  • Likelihood of meeting an emissions budget?
  • Which emission sources are the most significant?
  • What is the inter-unit variability in emissions?
  • What is the uncertainty in mean emissions for a group/fleet of sources?
  • What are the implications of uncertainty in emissions for air quality management, risk management of human exposures?
  • Consideration of geographic extent and averaging time
  • Estimation for future scenarios versus retrospective estimates of past emissions or assessment of current emissions
Motivations for Probabilistic Analysis
  • “That a perfect assessment of uncertainty cannot be done, however, should not stop researchers from estimating the uncertainties that can be addressed quantitatively” (p. 150, NRC, 2000)
  • “EPA, along with other agencies and industries, should undertake the necessary measures to conduct quantitative uncertainty analyses of the mobile-source emissions models in the modeling toolkit.” (p. 166, NRC, 2000)
Current Practice for Qualifying Uncertainty in Emission Factors and Inventories
  • Qualitative ratings for emission factors (AP-42)
  • Data Attribute Rating System (DARS) (not really used in practice)
  • Both methods are qualitative
  • No quantitative interpretation
  • Some sources of uncertainty (e.g., non-representativeness) are difficult to quantify
  • Qualitative methods can complement quantitative methods
Statistical Methodological Approach
  • Compilation and evaluation of database
  • Visualization of data by developing empirical cumulative distribution functions
  • Fitting, evaluation, and selection of alternative probability distribution models
  • Characterization of uncertainty in the distributions for variability (e.g., uncertainty in the mean)
  • Propagation of uncertainty in activity and emissions factors to estimate uncertainty in total emissions
  • Calculation of importance of uncertainty
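A minimal sketch of the last two steps for a hypothetical two-source inventory: Monte Carlo propagation of uncertainty in emission factors and activity to total emissions, followed by a rough importance ranking via rank correlation. All distributions and values are invented for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
n = 20_000

# Hypothetical two-source inventory: total = EF1*A1 + EF2*A2, with
# uncertainty in each mean emission factor (EF) and activity (A).
ef1 = rng.lognormal(np.log(2.0), 0.3, n)   # e.g., tons per unit activity
a1 = rng.normal(100.0, 5.0, n)
ef2 = rng.lognormal(np.log(0.5), 0.6, n)
a2 = rng.normal(400.0, 20.0, n)

total = ef1 * a1 + ef2 * a2

mean = total.mean()
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"mean inventory = {mean:.0f}")
print(f"95% uncertainty range: {100*(lo/mean - 1):.0f}% to +{100*(hi/mean - 1):.0f}%")

# Importance of each input's uncertainty via rank correlation with the total.
for name, x in [("EF1", ef1), ("A1", a1), ("EF2", ef2), ("A2", a2)]:
    r, _ = spearmanr(x, total)
    print(f"{name}: rank correlation with total = {r:.2f}")
```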
Summary of Approaches to Emission Factor and Inventory Uncertainty
  • Probabilistic Methods
    • Empirical, Parametric
    • Mixture distributions
    • Censored distributions (non-detects)
    • Vector autoregressive time series (intra- and inter-unit correlation)
    • Bootstrap simulation
    • Expert Judgment
    • Monte Carlo simulation
    • Sensitivity analysis
  • Software tools:
    • AUVEE – Analysis of Uncertainty and Variability in Emissions Estimation
    • AuvTool – standalone software
Summary of Probabilistic Emissions Case Studies at NCSU
  • Case Studies (examples):
    • Point sources
      • Power Plants
      • Natural gas-fired engines (e.g., compressor stations)
    • Mobile sources
      • On-Road Highway Vehicles
      • Non-Road Vehicles (e.g., Lawn & Garden, Construction, Farm, & Industrial)
    • Area sources
      • Consumer/Commercial Product Use
      • Natural Gas-Fueled Internal Combustion Engines
      • Gasoline Terminal Loading Loss
      • Cutback Asphalt Paving
      • Architectural Coatings
      • Wood Furniture Coatings
  • Pollutants
    • NOx
    • VOC
    • Urban air toxics (e.g., Houston case study)
Example Results: Lawn & Garden Equipment

Based on Frey and Bammi (2002)

Probabilistic CO Emission Factors for On-Road Light Duty Gasoline Vehicles (Mobile5)

Based on Frey and Zheng (2002)

MOVES
  • Conceptual Basis for MOVES, the successor to Mobile6 and NONROAD (www.epa.gov/otaq/ngm.htm)
    • “Shootout”
    • NCSU report on modal/binning approach
  • NCSU recommended approaches for quantification of inter-vehicle variability and fleet average uncertainty in modal emission rates and estimates of emissions for driving cycles (details in our report to EPA)
  • EPA requested further assessment of an approximate analytical procedure for propagating error (report by Frey to EPA)
  • At last report, EPA was considering Monte Carlo simulation
Probabilistic AP-42 Emission Factors for Natural Gas-fueled Engines (July 2000 Version)

[Table: probabilistic emission factor results for 2SLB and 4SLB natural gas-fueled engines (table not reproduced).]

Notes: (a) Units are lb/10^6 BTU. (b) MLE is used for the 2SLB engine and MoMM for the 4SLB engine; W = Weibull distribution, G = Gamma distribution. (c) Calculated based upon bootstrap simulation results.

Based on Frey and Li (2003) (submitted)

Example of Benzene Emission Factor Category 3b: Nonwinter Storage Losses at a Bulk Terminal: Empirical Distribution
Example of Benzene Emission Factor Category 3b: Uncertainty in the Mean

Uncertainty in the mean: -73% to +200%

Detection Limits and Air Toxic Emission Factor Data
  • Many air toxics emission factor datasets contain one or more measurements below a “detection limit”
  • Detection limits can be unique to each measurement because of differences in sample volumes and analytical chemistry methods among sites or contractors
  • A database can contain some non-detected data with detection limits larger than detected values measured at other sites
Methodology: Conventional Methods for Censored Data
  • Conventional approaches to estimating the mean:
    • Remove non-detected values (biased)
    • Replace values below the DL with zero (underestimates)
    • Replace values below the DL with DL/2 (biased)
    • Replace values below the DL with the DL (overestimates)
  • These approaches cause biased estimates of the mean
  • They do not provide adequate insight regarding:
    • Population distribution
    • Unbiased statistics
    • Uncertainty in statistics
Methodology: Quantification of the Inter-Unit Variability in Censored Data
  • Maximum Likelihood Estimation (MLE) is used to fit parametric distributions to censored data
  • MLE is asymptotically unbiased
  • Fitted distribution is the best estimate of variability
  • Can estimate mean and other statistics from the fitted distribution
  • Can quantify uncertainty caused by random sampling error using Bootstrap simulation
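A minimal sketch of MLE for censored data, assuming a lognormal variability distribution: detected values contribute density terms to the likelihood, while each non-detect contributes the probability of falling below its own detection limit. The data and starting values here are invented.

```python
import numpy as np
from scipy import stats, optimize

# Synthetic example: detected values plus non-detects reported only as
# being below a per-sample detection limit (DL).
detected = np.array([0.8, 1.1, 1.9, 2.4, 3.5, 5.2, 7.9, 9.6, 14.0])
detection_limits = np.array([0.5, 1.0, 1.5, 2.0, 2.0])  # censored points

def neg_log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # Density contribution for detected values (lognormal log-pdf), plus
    # CDF contribution for values known only to lie below their DL.
    ll = stats.norm.logpdf(np.log(detected), mu, sigma).sum() - np.log(detected).sum()
    ll += stats.norm.logcdf(np.log(detection_limits), mu, sigma).sum()
    return -ll

res = optimize.minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
mu, sigma = res.x
fitted_mean = np.exp(mu + 0.5 * sigma**2)
print(f"fitted lognormal: mu = {mu:.2f}, sigma = {sigma:.2f}, mean = {fitted_mean:.2f}")
```

Bootstrapping this fit (resampling the censored dataset and refitting) would then quantify the sampling-error uncertainty in the fitted mean.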
Example Case Study: Formaldehyde Emission Factor from External Coal Combustion
  • 14 data points including 5 censored values
  • Each censored data point has a different detection limit
  • Some detected data values are less than some detection limits
  • There is uncertainty regarding the empirical cumulative probability of such detected data values
Results of Example Case Study: Lognormal Distribution Representing Inter-Unit Variability
Results of Example Case: Uncertainty in the Mean (Basis to Develop Probabilistic Emission Inventory)

Uncertainty in the mean: -77% to +208%

Mixtures of Distributions

Percent of data in 50% CI: 92%

Percent of data in 95% CI: 100%

Case Study
  • Charlotte modeling domain
  • 32 units from 9 different coal-fired power plants
  • 1995 and 1998 data used
  • Propagation of uncertainty investigated using July 12–16, 1995 meteorological data
  • Data available for emission and activity factors
  • Vector autoregressive time-series modeling of emissions from each unit
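The full case study used vector autoregressive time-series models across units; the sketch below shows the single-unit idea with a first-order autoregressive model for log-scale hourly emissions (all parameter values hypothetical), which produces the hour-to-hour correlation discussed in the findings.

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal autoregressive sketch for one unit's hourly NOx emissions:
# log-emissions follow AR(1) around a mean, giving hour-to-hour correlation.
phi = 0.8                        # lag-1 autocorrelation of log-emissions
mu, sigma = np.log(500.0), 0.2   # hypothetical lb/hr, on the log scale
hours = 120

log_e = np.empty(hours)
log_e[0] = mu
for t in range(1, hours):
    # Innovation variance chosen so the process is stationary with s.d. sigma.
    innovation = rng.normal(0.0, sigma * np.sqrt(1 - phi**2))
    log_e[t] = mu + phi * (log_e[t - 1] - mu) + innovation

emissions = np.exp(log_e)
print(f"lag-1 autocorrelation: {np.corrcoef(log_e[:-1], log_e[1:])[0, 1]:.2f}")
print(f"mean hourly emissions: {emissions.mean():.0f}")
```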
Time Series and Uncertainty

Different uncertainty ranges for different hours of day

Emission Factors and Inventories: Findings (1)
  • Visualization of data used to develop an inventory is highly informative to choices of empirical or parametric distribution models for quantification of variability or uncertainty
  • A key difficulty in developing probabilistic emission factors and inventories is finding the original data used by EPA and others.
    • When data are found, they are typically poorly documented.
    • The time required to assemble databases when original data could not be found was substantial
  • Test methods used for some emission sources are not representative of real world operation, implying the need for real world data and/or expert judgment when estimating uncertainty
  • Uncertainty in measurement methods is not adequately reported. There is a need for more systematic reporting of the precision and accuracy of measurement/test methods
  • Emissions databases should not be arbitrarily fragmented into too many subcategories. Conversely, subcategories should be created when there is a good (empirical) basis for doing so.
Emission Factors and Inventories: Findings (2)
  • Uncertainties in emission factors are typically positively skewed, unless the uncertainties are relatively small (e.g., less than about plus or minus 30 percent)
  • Uncertainty estimates might be sensitive to the choice of parametric distribution models if there is variation in the goodness-of-fit among the alternatives. However, in such cases, there is typically a preferred best fit. When several alternative models provide equivalent fits, results are not sensitive to the choice of the model
  • The quantifiable portion of uncertainty attributable to random sampling error can be large and should be accounted for when using emission factors and inventories
  • Variability in emissions among units could be a basis for assessing the potential of emissions trading programs
Emission Factors and Inventories: Findings (3)
  • Intra-unit dependence in hourly emissions is significant for some sources (e.g., power plants), including hourly and daily lag effects
  • Inter-unit dependence in emissions is important for some sources, such as power plants
  • Range of variability and uncertainty is typically much greater as the averaging time decreases
  • Even for sources with continuous emissions monitoring data, there is uncertainty regarding predictions of future emissions that can be informed by analysis of historical data
  • Prototype software demonstrates the feasibility of increasing the convenience of performing probabilistic analysis
  • Uncertainties in total inventories are often attributable to just a few key emission sources
Mobile5 and Mobile6: Findings
  • Range of variability and uncertainty in correction factors (e.g., temperature) dominate and are large
  • Uncertainties in average emissions are large in some cases (e.g., -80 to +220 percent): normality assumptions are not valid
  • Asymmetry in uncertainties associated with non-negative quantities and large inter-unit variability
  • Sensitivity analysis was used to identify key sources of uncertainty and recommend future data collection priorities in order to reduce uncertainty
  • There is a proliferation of driving cycles. Some of them are redundant and, therefore, unnecessary
  • When comparing model predictions with validation data, or when comparing models with each other, their prediction uncertainty ranges should be considered
  • It is difficult to do an uncertainty analysis for a model such as Mobile5 or Mobile6 after the fact; it would be much easier if uncertainty analysis were integrated into the data management and modeling approach.
Air Quality Modeling
  • Widely used for:
    • Assessment of regulatory options
    • Identification and evaluation of control strategies (which pollutants, how much control, where to control?)
    • Emissions permit applications
    • Identification of contributing factors to local, urban, and regional air quality problems
    • Human exposure assessment
Probabilistic Modeling

[Figure: input uncertainties in emissions, chemistry, meteorology, and initial and boundary conditions are propagated through the Variable-Grid Urban Airshed Model (UAM-V), yielding output uncertainties in peak ozone, local ozone, local NOx, and local VOC.]
Case Study of Hanna et al. (2001)
  • OTAG Modeling Domain (eastern U.S.)
  • UAM-V model with Carbon Bond-IV mechanism
  • Uncertainty in peak ozone concentrations
  • Assessment of effects of 50% reductions in each of NOx and VOC emissions
  • Quantification of uncertainty in 128 inputs
    • Literature review for chemical kinetic rate constants
    • Expert elicitation for emissions, meteorological, and initial and boundary condition inputs
  • Monte Carlo simulation with n=100
  • Correlations used to identify key sources of uncertainty
Key Findings of Case Study of Hanna et al. (2001)
  • It was feasible to perform Monte Carlo simulation on UAM-V for the OTAG domain and a seven day episode
  • Simulation results include base case uncertainty estimates for ozone concentrations and estimates of differences in ozone concentrations because of emissions reduction strategies
  • There was less uncertainty in estimates of differences in concentration than in absolute estimates of total concentration
  • Reductions in NOx emissions led to higher estimated reductions in O3 than did reductions in VOC emissions. This is consistent with the expectation that most of the domain is NOx-limited.
  • Key uncertainties include NOx photolysis rate, several meteorological variables, and biogenic VOC emissions
  • Compared to Hanna et al. (1998), there was more disaggregation of uncertainty estimates for emission sources, and this may tend to weaken the sensitivity to any one source. It is possible, however, that model outputs would be more sensitive to an aggregated collection of emissions sources.
  • There is a need for improved methods of uncertainty estimation, particularly for the chemical mechanism and the meteorological fields, and for better accounting of correlations and dependencies (e.g., temperature dependence of biogenic emissions).
Case Study of Abdel-Aziz and Frey (2004)
  • Focus was on evaluating implications of uncertainties in hourly coal-fired power plant NOx emissions with respect to ozone for the Charlotte, NC domain
  • Key questions:
    • (1) what is the uncertainty in ozone predictions solely attributable to uncertainty in coal-fired utility NOx emissions?;
    • (2) can uncertainties in maximum ozone levels be attributed to specific power plant units?;
    • (3) how likely is it that National Ambient Air Quality Standards (NAAQS) will be exceeded?; and
    • (4) how important is it to account for inter-unit correlation in emission uncertainties

Location of Power Plant Impact

Analysis of Correlation in Emissions versus Ozone Levels in a Specific Grid Cell Can Detect Influence of a Specific Plant

Key Findings from Abdel-Aziz and Frey (2004) study
  • The uncertainty in maximum 1-hour ozone predictions is potentially large enough to create ambiguity regarding compliance with the NAAQS for any given emissions management strategy.
  • Control strategies can be developed to achieve attainment with an acceptable degree of confidence, such as 90 or 95 percent.
  • There was a substantial difference in results when comparing “independent” versus “dependent” units – thus, it can be important to a decision to account for dependencies between units
  • Probabilistic air quality modeling results provide insight regarding where to site monitoring stations and regarding the number of stations needed
  • Under the old 1-hour standard, uncertainties in the maximum domain-wide ozone levels could be traced to an individual power plant, thereby implying that control strategies must include that plant.
  • Under the new 8-hour standard, uncertainties in maximum ozone levels are attributable to many plants, implying the need for a more widespread control strategy
Risk Assessment Modeling
  • Risk assessment is growing in importance as a basis for regulatory decision making
    • E.g., Phase 2 of MACT standards
    • Urban air toxics
    • Food safety and international trade
    • (etc.)
Human Exposure and Risk Analysis
  • Over the last 10-15 years, there has been growing acceptance and incorporation of probabilistic approaches to dealing with inter-individual variability and uncertainty
  • EPA has issued various guidance
  • International guidance: e.g., FAO/WHO
E. coli O157:H7 in Ground Beef Risk Assessment Model

[Figure: overview of the model scope. Hazard identification addresses production, slaughter, preparation, and dose-response. The exposure assessment tracks the pathogen from on-farm production and the transport and marketing of live animals, through slaughter (dehiding, evisceration, splitting, chilling, fabrication), to preparation (grinding, growth, cooking, consumption). A dose-response assessment translates exposure into morbidity and mortality.]
Findings Based Upon Risk Assessment
  • Risk assessment applications often differ from others because of distinction between inter-individual variability and uncertainty
  • A two-dimensional probabilistic simulation framework is used to separate the two dimensions (a minimal sketch follows this list)
  • Expert judgment is inherent in the process of fitting distributions to data for variability and is often used to estimate uncertainty in the parameters of such distributions
  • Sensitivity analysis is critical to interpretation of risk assessment results:
    • Assists in risk management decision-making
    • Prioritize future work to improve the assessment
  • There was a lack of practical guidance regarding sensitivity analysis of risk models that has been addressed by recent work
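A minimal sketch of the two-dimensional framework with invented distributions: an outer loop samples uncertain parameters (e.g., from bootstrap or Bayesian analysis), and an inner loop samples inter-individual variability given those parameters, yielding uncertainty about a variability statistic such as the 95th-percentile exposure.

```python
import numpy as np

rng = np.random.default_rng(9)
n_uncertainty, n_variability = 200, 1000

# Outer loop: uncertainty about the parameters of the variability
# distribution (hypothetical values for illustration).
mu_u = rng.normal(np.log(1.0), 0.15, n_uncertainty)
sigma_u = rng.uniform(0.4, 0.7, n_uncertainty)

# Inner loop: inter-individual variability in exposure, given parameters.
pct95 = np.empty(n_uncertainty)
for i in range(n_uncertainty):
    exposures = rng.lognormal(mu_u[i], sigma_u[i], n_variability)
    pct95[i] = np.percentile(exposures, 95)  # variability statistic

# Uncertainty about the 95th-percentile individual's exposure.
lo, hi = np.percentile(pct95, [2.5, 97.5])
print(f"95th-percentile exposure: median = {np.median(pct95):.2f}, "
      f"95% uncertainty range = ({lo:.2f}, {hi:.2f})")
```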
General Recommendations (1)
  • Uncertainty and sensitivity analysis should be used to answer key decision maker and stakeholder questions, e.g.:
    • prioritize scarce resources toward additional research or data collection
    • make choices among alternatives in the face of uncertainty,
    • evaluate trends over time, etc.
  • Where relevant to decision making, uncertainty and sensitivity analysis should be included as functional requirements from the beginning and incorporated into model and input data development
  • There should be minimum reporting requirements for uncertainty in data (e.g., summary statistics such as mean, standard deviation, sample size)
  • Federal agencies should continue to improve documentation and accessibility of models and data for public peer review
General Recommendations (2)
  • Foster greater acceptance of appropriate methods for including, documenting, and reviewing expert judgment in regulatory-motivated modeling and analysis
  • There is a need for flexibility since there are many possible approaches to analysis of uncertainty and sensitivity. Specific choices should be appropriate to assessment objectives, which are typically context-specific
  • Human resources for modeling, including uncertainty and sensitivity analysis, should be appropriately committed.
    • Adequate time and budget to do the job right the first time (could save time and money in the long run)
    • Adequate training and peer review
    • Promote workshops and other training opportunities, and periodic refinement of authoritative compilations of techniques and recommended practice
General Recommendations (3)
  • Software tools substantially facilitate both uncertainty and sensitivity analysis (e.g., Crystal Ball) but in some ways also limit what is done in practice – there is a long-term need for software tools appropriate to specific types of applications
  • Some areas need more research – e.g., best techniques for communication, real-world information needs for decision makers
  • The relevance of analyses to decision making needs to be emphasized and considered by analysts
  • Decision makers need or should have access to information on why/how they should use probabilistic results
  • A multi-disciplinary compilation of relevant case studies and insights from them is a useful way to help convince others of the value of doing uncertainty and sensitivity analysis
  • Uncertainty and sensitivity analysis should be an open and transparent process that can be subject to scrutiny and peer review