
Development of Chemistry Indicators

Steven Bay

Southern California Coastal Water Research Project (SCCWRP)

[email protected]


Presentation Overview

  • Workplan update and response to comments

  • Project status

  • Preliminary results

    • Data screening

    • Normalization

    • SQG comparison


Chemistry Indicators

  • A methodology for interpreting sediment chemistry data relative to impacts on benthic organisms (e.g., an SQG approach with numeric values)

  • Link to pollutants of concern

  • Familiar approach

  • Large amount of available data

  • Several challenges to effective use

    • Bioavailability

    • Unmeasured chemicals

    • Mixtures


Objectives

  • Identify important geographic, geochemical, or other factors that affect the relationship between chemistry and effects

  • Develop indicator(s) that reflect contaminant exposure

  • Develop indicator(s) that are protective and predictive of impacts

  • Develop thresholds for use in MLOE framework


Approach

  • Develop a database of CA sediment quality information for use in developing and validating indicators

    • Address concerns and uncertainty regarding influence of regional factors

    • Document performance of recommended indicators

  • Develop both empirical and mechanistic indicators, if possible

    • Both types have desirable attributes for SQO use

    • Investigate existing and new approaches

    • Emphasis is on priority chemicals identified as likely causes of impairment


Approach

  • Evaluate SQG performance

    • Use CA data

    • Use quantitative and consistent approach

    • Select methods with best performance for expected applications

  • Describe response levels (thresholds)

    • Consistent with needs of MLOE framework

    • Based on observed relationships with biological effects


SSC Comments

  • More detail needed regarding data screening, matching, establishment of validation dataset

  • Lack of clarity regarding the respective roles of empirical and mechanistic guidelines

    • Approaches not interchangeable

    • How will mechanistic guidelines be developed/validated?

    • Should use all available approaches, but how?

Response: this is an evolving and thorough process; an overview is included in this presentation.

Response: a conceptual plan is included in this presentation; your input is welcome.


SSC Comments

  • Clarify how metals normalization results will be used

  • Provide greater independence of chemistry line of evidence

  • More detail needed regarding calibration of guidelines and comparison of performance (within CA and nationally)

Response: we will explore its utility in improving guideline performance and establishing background concentrations.

Response: we agree this is an important goal; it is part of the motivation for using mechanistic guidelines and metal normalization.

Response: a revised comparison approach is proposed that is more consistent with the MLOE framework.


Tasks

  • Prepare development and validation datasets

  • Develop and refine SQGs

  • Evaluate SQGs

  • Describe response levels


Task 1: Prepare Datasets

Substantial progress made

Create high-quality, standardized datasets for development and validation activities

  • Evaluate data quality and completeness

    • matched chemistry and biology

    • Appropriate habitat

    • Data quality, nondetects

  • Calculate derived values

    • e.g., sums, means, quotients (see the sketch after this list)

  • Normalize data

    • e.g., metals, TOC

  • Stratify and subset data

    • Independent validation data

    • Address geographic or mixture patterns
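As a rough illustration of the derived values mentioned above, the sketch below computes per-chemical SQG quotients and a mean SQG quotient for two hypothetical stations. The guideline values, chemicals, and station data are placeholders for illustration, not the values used in this project.

    import pandas as pd

    # Placeholder guideline values (mg/kg dry weight) for illustration only;
    # they are not necessarily the SQG values applied in this project.
    guidelines = pd.Series({"Cd": 9.6, "Cu": 270.0, "Pb": 218.0, "Zn": 410.0})

    # Hypothetical station chemistry (mg/kg dry weight).
    samples = pd.DataFrame(
        {"Cd": [0.5, 12.0], "Cu": [40.0, 310.0], "Pb": [25.0, 400.0], "Zn": [90.0, 600.0]},
        index=["Station A", "Station B"],
    )

    # Quotient for each chemical = measured concentration / guideline value.
    quotients = samples / guidelines

    # Mean SQG quotient per station: one simple mixture summary of how far
    # overall contamination departs from the guideline values.
    mean_quotient = quotients.mean(axis=1)
    print(mean_quotient)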



Data Screening

  • Appropriate habitat and geographic range

    • Subtidal, embayment, surface sediment samples

  • Chemistry data screening

    • Valid data (from qualifier information)

    • Estimated nondetect values

    • Completeness (metals and PAHs)

  • Toxicity data screening

    • Target test method selection

    • Valid data (control performance)

    • Lack of ammonia interference

  • Selection of matched data

    • Same station, same sampling event


Bay/Estuary Samples in Database After Screening


Validation Dataset

  • Used to confirm performance of recommended SQGs

  • Independent subset of SQO database

  • Approximately 30% of data, selected randomly to represent the contamination gradient (see the sketch after this list)

  • Includes acute and chronic toxicity tests
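A minimal sketch of one way such a split could be made, assuming a pandas DataFrame of stations with a contamination-gradient metric; the column name, bin count, and sampling fraction are assumptions for illustration, not the project's actual procedure.

    import pandas as pd

    def split_validation(stations, gradient_col="mean_sqg_quotient",
                         frac=0.30, bins=4, seed=1):
        """Reserve ~30% of stations as an independent validation set, sampling
        within contamination-gradient bins so both subsets span the gradient."""
        strata = pd.qcut(stations[gradient_col], q=bins, labels=False, duplicates="drop")
        validation = stations.groupby(strata, group_keys=False).sample(frac=frac, random_state=seed)
        development = stations.drop(validation.index)
        return development, validation

    # Usage (hypothetical DataFrame of screened SQO stations):
    # development, validation = split_validation(sqo_stations)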


Metal Normalization

  • Metals occur naturally in the environment

    • Silts and clays have higher metal content

    • Source of uncertainty in identifying anthropogenic impact

    • Background varies due to sediment type and regional differences in geology

  • Need to differentiate between natural background levels and anthropogenic input

    • Investigate utility for empirical guideline development

    • Potential use for establishing regional background levels


Reference Element Normalization

  • Established methodology applied by geologists and environmental scientists

  • Reference element covaries with natural sediment metals and is insensitive to anthropogenic inputs

  • Use of iron as reference element validated for southern California

    • 1994 and 1998 Bight regional surveys


Reference Element Normalization

[Scatterplots of chromium (mg/kg) versus iron (%), illustrating the reference element relationship]


Reference Element Normalization

[Scatterplots of nickel and copper versus iron, with fitted reference relationships]

Use iron:metal relationships to (as sketched below):

  • Estimate the amount of anthropogenic metal for use in SQG development

  • Identify background metal concentrations
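The sketch below illustrates the general idea with made-up numbers: fit the iron:metal relationship from stations assumed to represent natural background, then treat the amount above the fitted line as an estimate of anthropogenic excess. The data and the choice of a simple least-squares fit are assumptions for illustration only.

    import numpy as np
    from scipy import stats

    # Made-up reference data: iron (%) and chromium (mg/kg dry weight) at
    # stations assumed to reflect natural background conditions.
    fe_ref = np.array([1.2, 1.8, 2.4, 3.0, 3.6, 4.1])
    cr_ref = np.array([18.0, 27.0, 35.0, 44.0, 52.0, 60.0])

    # Fit the natural iron:chromium relationship.
    slope, intercept, r, p, se = stats.linregress(fe_ref, cr_ref)

    def excess_chromium(fe_pct, cr_measured):
        """Estimated anthropogenic excess: measured concentration minus the
        level expected from the iron relationship (values near zero or below
        suggest the sample is at or below natural background)."""
        return cr_measured - (intercept + slope * fe_pct)

    # Example station: 2.5% iron, 90 mg/kg chromium.
    print(excess_chromium(2.5, 90.0))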


Task 2: Develop/Refine SQGs

Work in progress

Investigate a variety of approaches or refinements and pursue those with the best potential for success.

Focus on mixture models, empirical and mechanistic

  • Apply existing approaches (off the shelf)

  • Refine existing approaches

  • Calibrate existing approaches

  • Develop new approaches



Mechanistic vs. Empirical SQGs

  • Differences in utility for predicting impacts and determining causation

  • Both types of information needed for interpretation of chemistry data

    • Mechanistic SQG results will be useful for subsequent applications needing to identify cause of impairment

  • Anticipate chemistry LOE score will be based on combination of SQGs

    • Complementary, not interchangeable

    • Several strategies possible, looking for input on recommended approach


Proposed Scoring For Multiple SQGs


Guideline Calibration

  • Use of CA chemistry/effects data to adjust empirical guideline models or thresholds

    • LRM: model and thresholds (see the sketch after this list)

    • Effects range: CA-specific values and thresholds

    • AET: CA-specific values

    • Consensus & SQGQ-1: thresholds

  • Comparisons between existing and calibrated SQG results used to guide recommendations

    • Only use calibrated values if improved performance can be demonstrated
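Assuming LRM refers to the logistic regression model approach, the sketch below shows, with made-up data, how a single-chemical model could be refit to California chemistry/toxicity pairs and a candidate threshold read off at a chosen probability of toxicity. It is an illustration of the idea, not the project's calibration procedure.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Made-up single-chemical data: concentration (mg/kg) and observed toxicity (1 = toxic).
    conc = np.array([5, 12, 30, 55, 90, 150, 240, 400, 700, 1200.0])
    toxic = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

    # Refit the logistic model of p(toxic) versus log10 concentration.
    X = np.log10(conc).reshape(-1, 1)
    model = LogisticRegression().fit(X, toxic)

    # A candidate threshold is the concentration at which the refit model
    # predicts a chosen probability of toxicity (here p = 0.5).
    b0, b1 = model.intercept_[0], model.coef_[0, 0]
    p_target = 0.5
    threshold = 10 ** ((np.log(p_target / (1 - p_target)) - b0) / b1)
    print(round(threshold))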


Task 3: Evaluate Approaches

Work in progress

Document and compare performance of candidate SQG approaches in a manner relevant to desired applications

  • Compare overall discriminatory ability

  • Identify applications

  • Quantify performance

    • Validation dataset

    • Standardized measures

  • Compare performance and identify the most suitable approaches


Performance Comparison

  • Approach

    • Focus on empirical guidelines

    • Compare among candidates to select a short list

    • Compare to existing approaches to evaluate need for new/calibrated approaches

  • Previous strategy for comparison

    • Current work plan: Binary evaluation (effect/no effect)

    • Calculate several measures of performance


Performance Measures

In a binary (effect/no effect) evaluation, each sample falls into one of four cells:

                               Toxic    Nontoxic
    SQG predicts effect (hit)    B          D
    SQG predicts no effect       A          C

Negative Predictive Value = C/(C+A) x 100 (percent of no hits that are nontoxic)

Specificity = C/(C+D) x 100 (percent of all nontoxic samples that are classified as a no hit)

Positive Predictive Value = B/(B+D) x 100 (percent of hits that are toxic)

Sensitivity = B/(B+A) x 100 (percent of all toxic samples that are classified as a hit)
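A minimal sketch of these four measures as a function of the cell counts A-D; the example counts are invented.

    def binary_performance(A, B, C, D):
        """Performance measures from the 2x2 table above, where
        A = toxic samples without an SQG hit, B = toxic samples with an SQG hit,
        C = nontoxic samples without an SQG hit, D = nontoxic samples with an SQG hit."""
        return {
            "sensitivity_pct": 100.0 * B / (B + A),
            "specificity_pct": 100.0 * C / (C + D),
            "positive_predictive_value_pct": 100.0 * B / (B + D),
            "negative_predictive_value_pct": 100.0 * C / (C + A),
        }

    # Invented counts for illustration only.
    print(binary_performance(A=15, B=45, C=120, D=20))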


Performance Comparison

Proposed revised strategy

  • Evaluate ability to classify stations into multiple categories

    • More consistent with MLOE approach

    • Less reliance on a single threshold

    • Magnitude of error affects score

  • Utilize both toxicity and benthic impact data


SQG 1

                                Observed Toxicity
    Predicted Effect From SQG   High   Moderate   Marginal   Reference
    High                          60         30         20           1
    Moderate                      33         50         25           0
    Marginal                      10         14         65           6
    Reference                      3          7         20          25


Kappa Statistic

  • Developed in the 1960s and 1970s

  • Used in medicine, epidemiology, & psychology to evaluate observer agreement/reliability

    • Similar problem to SQG assessment

    • Can incorporate a penalty for extreme disagreement

  • Sediment quality assessment is a new application


SQG 1 (good association between adjacent categories)

Kappa = 0.48

                                Observed Toxicity
    Predicted Effect From SQG   High   Moderate   Marginal   Reference
    High                          60         30         20           1
    Moderate                      33         50         25           0
    Marginal                      10         14         65           6
    Reference                      3          7         20          25
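As an illustration only, the sketch below computes a weighted form of Cohen's kappa from a confusion table like the one above. The weighting scheme (linear versus quadratic penalty) used in the project is an assumption here, so the result need not reproduce the reported 0.48 exactly.

    import numpy as np

    def weighted_kappa(table, penalty="linear"):
        """Cohen's kappa for a square confusion table (rows = predicted effect,
        columns = observed category). Disagreements are weighted by how many
        categories apart they are, so extreme misclassifications are penalized
        more heavily than near misses."""
        table = np.asarray(table, dtype=float)
        n = table.sum()
        k = table.shape[0]
        i, j = np.indices((k, k))
        power = 1 if penalty == "linear" else 2
        w = (np.abs(i - j) / (k - 1)) ** power      # 0 on the diagonal, 1 in the corners
        observed = table / n
        expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n ** 2
        return 1.0 - (w * observed).sum() / (w * expected).sum()

    # Confusion table from the SQG 1 slide (High, Moderate, Marginal, Reference).
    sqg1 = [[60, 30, 20, 1],
            [33, 50, 25, 0],
            [10, 14, 65, 6],
            [ 3,  7, 20, 25]]
    print(weighted_kappa(sqg1))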


SQG 2 (poor association between adjacent categories)

Kappa = 0.27

                                Observed Toxicity
    Predicted Effect From SQG   High   Moderate   Marginal   Reference
    High                          60          1         20          30
    Moderate                      33         50          0          25
    Marginal                      14         10         65           6
    Reference                     20          7          3          25


Task 4: Describe Response Levels

Methodology under development

Determine levels of response for the recommended SQG approaches

  • Relate SQGs to biological effect indicator responses (benthos & toxicity)

    • May use statistical methods to optimize thresholds

  • Select response levels that correspond to objectives for performance and beneficial use protection


Summary

  • Work on many key elements underway

    • Priority is to build upon existing approaches

    • Many of the technical obstacles have been addressed

  • Overall approach is consistent with SSC recommendations

    • Include empirical and mechanistic approaches

  • Expect to succeed in selecting recommended SQGs for use in MLOE framework

  • Much work remains, especially for development of thresholds

