
CCAMP – Past, Present, and Future: Central Coast Ambient Monitoring Program

Learn about the goals, approach, and accomplishments of the Central Coast Ambient Monitoring Program (CCAMP) and how its data can inform decision-making. Discover where CCAMP plans to go in the future.


Presentation Transcript


  1. CCAMP – Past, Present, and Future: Central Coast Ambient Monitoring Program. Central Coast Regional Water Quality Control Board

  2. ABOUT CCAMP Central Coast Ambient Monitoring Program • Program goals and objectives • CCAMP Approach • What we have accomplished • What our data can tell us • Where we would like to go

  3. Program Goals and Objectives

  4. Original CCAMP Program Goals • Determine status and trends of surface, ground, estuarine, and coastal water quality and associated beneficial uses • Provide water quality information to users in accessible forms to support decision making • Collaborate with other monitoring programs to provide effective and efficient monitoring

  5. AB 982 established SWAMP in 2000; the Legislative Report defined questions and objectives based on Beneficial Use impairment • Is it safe to swim? • Is it safe to drink the water? • Is aquatic life protected? • Is it safe to eat the fish and shellfish? etc…

  6. Is there evidence that it is unsafe to swim? • Beneficial Use: Water Contact Recreation • Objective: Screen for indications of bacterial contamination by determining the percent of samples exceeding adopted and mandated water quality objectives • Monitoring Approach: Monthly monitoring for indicator organisms; compilation of other data • Assessment Limitations: CCAMP currently samples for fecal and total coliform • Criteria (90% probability): 10% over 400 MPN fecal coliform; 10% over 235 MPN E. coli; 10% over 104 MPN Enterococcus (bays and estuaries); fecal-to-total coliform ratio over 0.1 when total coliform exceeds 1000 MPN (bays and estuaries)
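
The screening criterion above boils down to a percent-exceedance check. The sketch below is illustrative only, not CCAMP's actual code: the percent_exceeding helper and the monthly sample values are hypothetical, while the 400 MPN fecal coliform objective and 10% criterion come from the slide.

```python
# Illustrative sketch: flag a site when more than 10% of samples exceed a
# water quality objective (thresholds per the criteria listed above).
def percent_exceeding(samples, threshold):
    """Return the percent of sample values above the threshold."""
    if not samples:
        return 0.0
    return 100.0 * sum(1 for v in samples if v > threshold) / len(samples)

# Hypothetical monthly fecal coliform results (MPN/100 mL) for one site
fecal_coliform = [120, 80, 560, 200, 1600, 90, 300, 410, 70, 150, 220, 95]

pct = percent_exceeding(fecal_coliform, 400)   # objective: 400 MPN fecal coliform
print(f"{pct:.1f}% of samples exceed 400 MPN -> flagged: {pct > 10.0}")
```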

  7. Scientific and Technical Planning and Review • SWAMP Scientific Planning and Review Committee (SPARC) • Statewide and regional review this year • Local input each year from “rotational” TAC

  8. What we have accomplished

  9. We now have… • Funding • Six years of data • Website and data tools • CCLEAN • Agricultural Monitoring Program • Stormwater monitoring coordination • Characterization reports

  10. CCAMP Approach • Watershed rotation • Tributary-based site placement • Multi-indicator, “weight of evidence” • Integrator sites

  11. Multiple lines of evidence • Tissue Bioaccumulation • Water Column Toxicity and TIEs • Sediment Toxicity and Chemistry • Monthly Conventional Water Quality • Benthic Invertebrate and Habitat Assessment

  12. Watershed Rotation Each watershed area is monitored and evaluated every 5 years

  13. Coastal Confluences Thirty-three coastal sites are monitored continuously above salt water influence

  14. Robust Time series Unsafe to drink?

  15. Same data….

  16. Web Site • Used extensively • When the server is down we receive “complaints” • SIMON • Templates for volunteer and ag websites • New mapping tools!

  17. What our data can tell us

  18. A few data uses… • Identifying problem areas for NPS activities • Ag monitoring program development • 303(d) listing, 305(b) assessment, TMDL development and compliance monitoring • Basin Plan standards evaluation • Statewide nutrient objectives work • Public interaction

  19. CCAMP Tools • Data analysis • Quality Assurance • Biostimulatory Risk Index • Pesticide Risk Index • Index of Biotic Integrity

  20. Biostimulatory Risk Index • Ranking includes: % algal/plant cover, chlorophyll a, nitrogen/phosphorus, range of DO, range of pH • Example ranks: Limekiln Creek (Rank = 0.0), Franklin Creek (Rank = 85.3)
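
The slide names the metrics that feed the index but not the scoring method. The sketch below shows one way such a composite score could be assembled, rescaling each metric to 0–100 against an assumed worst-case value and averaging; the rescale helper, the worst-case values, and the example inputs are all assumptions, not CCAMP's actual formula.

```python
# Illustrative composite index only; CCAMP's actual weighting may differ.
def rescale(value, worst):
    """Map a metric onto 0-100, capping at an assumed worst-case value."""
    return min(value / worst, 1.0) * 100.0

def biostim_risk(pct_cover, chl_a, nitrate, do_range, ph_range):
    scores = [
        rescale(pct_cover, 100.0),  # % algal/plant cover
        rescale(chl_a, 150.0),      # chlorophyll a (ug/L), assumed worst case
        rescale(nitrate, 45.0),     # nitrate (mg/L), assumed worst case
        rescale(do_range, 10.0),    # diel dissolved oxygen range (mg/L)
        rescale(ph_range, 2.0),     # diel pH range (units)
    ]
    return sum(scores) / len(scores)

# Hypothetical inputs for a low-risk and a high-risk site
print(round(biostim_risk(5, 2, 0.5, 1.0, 0.2), 1))    # small score
print(round(biostim_risk(90, 120, 40, 8.0, 1.8), 1))  # large score
```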

  21. Biostimulatory Risk Scores (shown as quartiles)

  22. Pesticide Risk Index Based on pounds applied and chemical diversity • Pesticide applications are a predictor of toxicity • In press, by Hunt and Anderson

  23. Pesticide Use and Nitrate

  24. Bioassessment • Biological endpoint for cumulative impacts • Growing regulatory use throughout the state and nation • Index of Biotic Integrity • Development of biocriteria for the Region

  25. SoCal IBI vs CCAMP IBI

  26. CCAMP IBI vs Biostim Risk Index

  27. Where we would like to go

  28. 5-year plan • Continually increase data use and utility (change detection, trends) • Publish findings and tools • “Institutionalize” data management approach; provide better support to partner organizations • Biocriteria and reference conditions • Augment monitoring efforts to support new office vision

  29. Conservation Ethic • New RWQCB focus on riparian and wetland protection; pervious surface protection • CCAMP will help prioritize effort, document trends, and track progress.

  30. New SWAMP Protocols • SWAMP physical habitat protocols provide a more complete picture of habitat quality • Canopy cover and complexity • Riparian width • In-channel habitat • Pebble counts etc…

  31. California Rapid Assessment Method for Wetlands (CRAM) • Developed by a large multi-agency effort including CCC, EPA, SCCWRP, and SFEI • Piloted in the Morro Bay watershed • We hope to collaborate with CCC staff to use CRAM at our monitoring sites this season

  32. CRAM sample map Lower Carmel River

  33. Delineation of corridors and impervious surfaces using imagery (NOAA Coastal Change Analysis Program)

  34. CCAMP Web Site www.CCAMP.org

  35. Data Quality: Tips for getting it, keeping it, proving it! Central Coast Ambient Monitoring Program

  36. Why do we care? • Program goals are key – what do you want to do with the data? • Data utility increases directly with quality • High-quality data can support 303(d) listing decisions, regulatory decisions, and management decisions • High-quality data will be taken seriously • Usable data is well-documented data

  37. Recent 303(d) listing guidance requires that data used to support a listing be associated with a quality assurance program plan

  38. Quick Overview • Data quality objectives • Data validation and verification • Quality assurance documents • Other QA tools

  39. Data and Measurement Quality Objectives • Basis of data quality determinations • Underlies the concept of “SWAMP Compatibility” • Should be defined in an approved Quality Assurance Program Plan

  40. Basic QA terms describing data quality: • Precision • Accuracy and bias • Representativeness • Completeness • Comparability • Sensitivity

  41. Precision and Bias • High precision, high bias: Inaccurate • Low precision, high bias: Highly Inaccurate! • High precision, low bias: Accurate! • Low precision, low bias: Inaccurate

  42. Determining Precision • Precision is a measure of repeatability and consistency • Precision is calculated from sample replicates and/or splits • Precision depends on the instrument or method, as well as variability introduced by handling and the environment • You should be collecting at least 5% duplicates (or one per event) to be “SWAMP compatible”

  43. Determining Precision • For multiple samples, calculate Relative Standard Deviation (RSD): RSD = (s / x̄) * 100 = % deviation of measurements, where s is the standard deviation and x̄ is the mean • For pairs of samples, calculate Relative Percent Difference (RPD): RPD = (X1 – X2) / ((X1 + X2) / 2) * 100 = % deviation of measurements, where X1 is the larger value • The SWAMP RPD requirement is typically 25%
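
As a worked illustration of the two statistics above, the sketch below computes RSD across a set of replicates and RPD for a duplicate pair, then checks the pair against the typical SWAMP 25% criterion. The replicate values are hypothetical.

```python
# Precision statistics from the slide, using Python's statistics module.
import statistics

def rsd(values):
    """Relative Standard Deviation: (s / mean) * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

def rpd(x1, x2):
    """Relative Percent Difference: (larger - smaller) / mean of the pair * 100."""
    larger, smaller = max(x1, x2), min(x1, x2)
    return (larger - smaller) / ((larger + smaller) / 2.0) * 100.0

# Hypothetical replicate nitrate results (mg/L)
print(round(rsd([2.1, 2.3, 2.2, 2.0]), 1))   # RSD across multiple replicates

# Hypothetical field duplicate pair, checked against the 25% SWAMP requirement
pair_rpd = rpd(2.1, 2.4)
print(round(pair_rpd, 1), "OK" if pair_rpd <= 25.0 else "exceeds 25% criterion")
```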

  44. What’s the difference between a field duplicate and a field split? Field duplicates: collected side by side, a combined measurement of environmental variability and sampling and laboratory error Field splits: collected from a composite sample, factors out environmental variability. A measure of variability associated with sampling and laboratory error. May be sent to different laboratories to measure interlab variability.

  45. Bias and Accuracy… how to hit the target! • Calibrate against standard reference material • Calibrate against an “expert” • Spike sample with a known concentration

  46. Calculating accuracy • SWAMP requires that the measured value be between 80% and 120% of the known value • PD = (X1 – X2) / X1 * 100 = % deviation of the measurement from the reference, where X1 is the known value
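
A minimal sketch of this check, assuming a comparison against a known reference value (e.g., a standard or a spiked sample): percent_deviation implements the PD formula above and within_swamp_accuracy applies the 80–120% window; the 5.0 mg/L standard and 4.6 mg/L reading are hypothetical.

```python
def percent_deviation(known, measured):
    """PD = (known - measured) / known * 100."""
    return (known - measured) * 100.0 / known

def within_swamp_accuracy(known, measured):
    """True when the measured value is between 80% and 120% of the known value."""
    recovery = measured / known * 100.0
    return 80.0 <= recovery <= 120.0

# Hypothetical check: a 5.0 mg/L standard read as 4.6 mg/L (92% recovery)
print(round(percent_deviation(5.0, 4.6), 1))   # 8.0% deviation from the reference
print(within_swamp_accuracy(5.0, 4.6))         # True
```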

  47. Calibrating instruments • In order to track equipment accuracy, keep an instrument calibration log • Instrument drift is determined by calibrating pre- and post-sampling; RPD should be within DQO requirements • Be sure to record instrument ID in your data records; in some cases bias can be adjusted out of data if it can be shown to be consistent

  48. Blanks • Assessing contamination • Different types for different purposes: field, equipment, travel, lab • SWAMP suggests a CWQ field blank during a periodic field audit; if performance is inadequate, increase to 5% • Blanks must be < the Minimum Detection Limit

  49. Representativeness • Think through your sampling design: what are you trying to show? • Consider: statistical adequacy, seasonality, time of day, spatial variability
