
Quality Assurance/Quality Control and Quality Assurance Project Plans





Presentation Transcript


  1. Quality Assurance/Quality Control and Quality Assurance Project Plans • Greg Thoma, University of Arkansas, IPEC Quality Assurance Officer

  2. Quality Assurance/Quality Control • QA is management of the data collection system to assure the validity of the data. • Organization & responsibilities • QC refers to the technical activities that provide quantitative information about data quality. • Data quality indicators, calibration procedures • Quality Assurance Project Plan • The document that provides the details of QA & QC for a particular project

  3. Quality? • How good is “good enough”? If 99.9% were good enough, that would still mean: • 1 hour of unsafe drinking water every month • 22,000 checks deducted from the wrong accounts every hour • 16,000 pieces of mail lost every hour • What does data quality mean? • A universal standard? A relative measure? • The goal of generators of environmental data should be to produce data of known quality to support environmental decisions • Is the site clean? • Does the technology work?

  4. Scientific Method • Observe something interesting • Invent a tentative theory or hypothesis consistent with the observations • Use the hypothesis to make predictions • Test the predictions with planned experiments • Are there discrepancies between observation and theory? • Yes: modify the hypothesis in light of the results • No: conclude the theory is true • How do you know if there are discrepancies? Uncertainty in the observed values reduces the ability to discriminate differences.

  5. Data Life Cycle

  6. Performance and Acceptance Criteria • Performance criteria address the adequacy of information that is to be collected for the project. • “Primary” data. • Acceptance criteria address the adequacy of existing information proposed for inclusion in the project. • “Secondary” (literature) data.

  7. Performance and Acceptance Criteria • Effective data collection is rarely achieved in a haphazard fashion. • The hallmark of all good projects, studies, and decisions is a planned data collection. • A systematic process leads to the development of acceptance or performance criteria that are: • based on the ultimate use of the data to be collected, and • define the quality of data required to meet the final project objectives. QAG/4A

  8. Performance and Acceptance Criteria • The PAC development process helps to focus studies by encouraging experimenters to clarify vague objectives and explicitly frame their study questions. • The development of PAC is a planning tool that can save resources by making data collection operations more resource-effective.

  9. PAC Process at Project Level • State the problem • Oil-contaminated soil needs to be remediated • Identify the study questions • Testable hypotheses rather than general objectives • We hypothesize that the contaminated soil, under nutrient-rich conditions, will exhibit the highest rates of degradation due to the history of hydrocarbon exposure these microbial communities have experienced. • Establish study design constraints • Budget, timeline, spatial extent, technical issues, etc. • 7 factors, 2 levels, 4 reps, 8 sample times: a full factorial would be 2^7 × 4 × 8 = 4,096 samples!

  10. PAC Process at Project Level • Identify data requirements • What needs to be measured? Soil properties, nutrient status, contaminant level, etc. • Specify information quality • May be qualitative • Representativeness, comparability • or quantitative • DQI: precision, bias, accuracy, and sensitivity • Strategy for information synthesis • How will it be analyzed? ANOVA? Regression? (see the sketch below) • Optimize experimental design • Get ‘good enough’ data at the lowest cost
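The analysis strategy named above can be made concrete. Below is a minimal Python sketch of the one-way ANOVA the slide mentions, comparing degradation rates across three nutrient treatments; the treatment names and rate values are hypothetical illustration data, not results from the project.

```python
# One-way ANOVA: do mean degradation rates differ across treatments?
# Treatment labels and values are hypothetical illustration data.
from scipy import stats

# Degradation rate (mg TPH / kg soil / day) per replicate, by treatment
control    = [1.2, 1.5, 1.1, 1.3]
nutrient   = [2.4, 2.1, 2.6, 2.2]
surfactant = [1.6, 1.8, 1.4, 1.7]

f_stat, p_value = stats.f_oneway(control, nutrient, surfactant)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one treatment mean differs (alpha = 0.05)")
```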

  11. QA in Your Future? • Intergovernmental Data Quality Task Force: • Uniform Federal Policy for Implementing Environmental Quality Systems • A joint initiative among EPA, DoD, and DOE to resolve data quality inconsistencies and deficiencies, ensuring that: • Environmental data are of known and documented quality and suitable for their intended uses, and • Environmental data collection and technology programs meet stated requirements. • And don’t forget TQM, ISO 9000, & Six Sigma!

  12. A Graded Approach • The level of planning detail and documentation may: • correspond to the importance of the project to its stakeholders • e.g., projects with significant associated health risks • reflect the overall scope and budget of the effort • Superfund cleanup vs. proof-of-concept research • be driven by the inherent technical complexity or the political profile of the project • complex or politically sensitive projects generally require more documentation.

  13. Quality Assurance Project Plan • Documentation of routine laboratory practice • Elements • A. Project Management • B. Data Generation and Acquisition • C. Assessment and Oversight • D. Data Validation and Usability

  14. Group A. Project Management • Title Page • Signature Approval Sheet • Table of Contents • Distribution List • Project/Task Organization • Problem Definition/Background • Project/Task Description and Schedule • Quality Objectives (linked to PAC) • Special Training Requirements/Certification • Documentation and Records

  15. Performance Criteria for Phytoremediation Project

  16. Performance Criteria for Phytoremediation Project Acceptance criteria will be developed for published meteorological data and data generated in other studies used in the modeling for this project.

  17. Data Quality Indicators • Bias: systematic factor causing error in one direction • Precision: agreement of repeated measures of the same quantity • Accuracy: combination of precision and bias • Representativeness: how well the sample represents the population • Comparability: how well two or more datasets may be combined • Completeness: ratio of valid data obtained to the total planned data collection • Sensitivity: separating the signal from the noise
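To make the quantitative indicators concrete, here is a minimal sketch computing bias, precision (as relative standard deviation), and completeness from replicate measurements of a reference standard. The certified value and replicate results are hypothetical illustration data.

```python
# Compute three DQIs from replicate measurements of a reference standard.
# All values are hypothetical illustration data.
import statistics

true_value = 50.0                          # certified concentration (mg/L)
replicates = [48.9, 51.2, 49.5, 50.8, 49.1, 50.4, 49.8]
planned_n, valid_n = 8, len(replicates)    # assume one planned result was lost

mean = statistics.mean(replicates)
s = statistics.stdev(replicates)           # sample standard deviation (n-1 df)

bias = mean - true_value                   # systematic error, one direction
precision_rsd = 100 * s / mean             # relative standard deviation (%)
completeness = 100 * valid_n / planned_n   # valid results / planned results (%)

print(f"bias = {bias:+.2f} mg/L, RSD = {precision_rsd:.1f}%, "
      f"completeness = {completeness:.0f}%")
```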

  18. Accuracy

  19. Components of Variability

  20. Representativeness • Extremely important • NAAQS sampling next to a bus stop?? • Stack gas monitoring – isokinetic sampling • Sampling plan design • Number and locations (one statistical sizing approach is sketched below) • Sample size, sampling method, and handling • Grab vs. composite, preservation methods, etc.
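One common statistical approach to the “number of samples” question (stated here as an assumption, not a method prescribed by the slide) is to choose n so the sample mean estimates the true mean within a target margin of error, given a pilot estimate of the standard deviation:

```python
# How many samples to estimate a mean within +/- d at confidence 1 - alpha?
# The pilot standard deviation and target margin are hypothetical.
from math import ceil
from scipy.stats import norm, t

def samples_needed(s, d, alpha=0.05, max_iter=100):
    """Samples needed so the mean is estimated within +/- d."""
    # Start from the normal (z) approximation, then iterate with Student t,
    # since the t critical value depends on the sample size itself.
    n = max(ceil((norm.ppf(1 - alpha / 2) * s / d) ** 2), 2)
    for _ in range(max_iter):
        n_new = max(ceil((t.ppf(1 - alpha / 2, df=n - 1) * s / d) ** 2), 2)
        if n_new == n:
            break
        n = n_new
    return n

# Pilot estimate s = 12 mg/kg; target margin of error d = 5 mg/kg
print(samples_needed(s=12, d=5))   # -> 25 samples
```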

  21. Group B. Measurement/Data Acquisition • Experimental Design • Sampling Methods Requirements • Sample Handling and Custody Requirements • Analytical Methods Requirements • Quality Control Requirements • Instrument/Equipment Testing, Inspection, and Maintenance Requirements • Instrument Calibration and Frequency • Inspection/Acceptance Requirements for Supplies • Data Acquisition Requirements (Non-direct Measurements) • Data Management

  22. Sample Handling and Preservation

  23. Quality Control Checks

  24. Impact of Detection Limit and Contaminant Concentration on Reporting

  25. MDL and False Positive Errors • For 7 injections (6 degrees of freedom), t = 3.71
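A minimal sketch of the calculation behind this slide: MDL = t × s, where s is the standard deviation of n low-level spike replicates. Note the multiplier convention varies: the slide’s t = 3.71 matches the two-sided 99% (one-sided 99.5%) Student t for 6 degrees of freedom, while EPA’s 40 CFR 136 Appendix B procedure uses the one-sided 99% value (about 3.14 for n = 7). The replicate values below are hypothetical.

```python
# Method detection limit: MDL = t * s for n low-level spike replicates.
import statistics
from scipy.stats import t

replicates = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49]  # hypothetical, ug/L
n = len(replicates)
s = statistics.stdev(replicates)        # sample standard deviation, n-1 df

t_995 = t.ppf(0.995, df=n - 1)          # ~3.71, the slide's multiplier
t_99 = t.ppf(0.99, df=n - 1)            # ~3.14, one-sided 99% (40 CFR 136)

print(f"s = {s:.3f} ug/L")
print(f"MDL with t = {t_995:.2f}: {t_995 * s:.3f} ug/L")
print(f"MDL with t = {t_99:.2f}: {t_99 * s:.3f} ug/L")
```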

  26. MDL and False Negative Errors

  27. Group C. Assessment and Oversight • Assessments and Response Actions • Procedures for monitoring data quality as it is collected • Actions to be taken in the event of failure to meet performance criteria • Stop analysis, correct problem, reanalyze • Reports to Management

  28. Group D. Data Validation and Usability • Data review, verification, and validation • Review • Check for transcription or data reduction errors and completeness of QC information. • Verification • Were the procedures in the QAPP accurately followed? • Validation • Does the data meet the PAC specified in the QAPP? • Reconciliation with user requirements • Is the data suitable for use by decision makers?

  29. Data Quality Assessment (DQA): • DQA is a quantitative process • Based on statistical methods • Does a set of data support a particular decision with an acceptable level of confidence? • 5 Steps: • Review the PAC and sampling design; • Conduct a preliminary data review; • Select the statistical test; • Verify the assumptions of the statistical test; and • Draw conclusions from the data.
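Steps 3 through 5 can be illustrated with a minimal sketch for an assumed scenario: testing whether a site’s mean concentration is below a cleanup standard. The action level and sample results are hypothetical illustration data.

```python
# DQA steps 3-5 sketch: one-sided, one-sample t-test against a standard.
# Action level and sample results are hypothetical illustration data.
from scipy import stats

cleanup_standard = 100.0    # mg/kg, assumed action level
samples = [84, 91, 77, 102, 88, 95, 80, 73, 98, 86]

# H0: mean >= standard  vs.  H1: mean < standard
t_stat, p_value = stats.ttest_1samp(samples, cleanup_standard,
                                    alternative='less')
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Mean is below the standard with 95% confidence: site passes")
else:
    print("Cannot conclude the mean is below the standard")
```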

  30. Example Quality Control Charts • RPD = |D1 − D2| / [(D1 + D2)/2] × 100%, the relative percent difference of duplicate results D1 and D2 • %R = (SSR − SR) / SA × 100%, the percent recovery, where SSR is the spiked sample result, SR the unspiked sample result, and SA the spike amount added
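A minimal sketch applying these two control-chart statistics; the control limits used for flagging (RPD ≤ 20%, recovery within 75–125%) are assumed example limits, not values from the slide.

```python
# Control-chart statistics for duplicates and matrix spikes.
# QC limits below are assumed example limits.
def rpd(d1, d2):
    """Relative percent difference of duplicate results."""
    return abs(d1 - d2) / ((d1 + d2) / 2) * 100

def percent_recovery(spiked_result, sample_result, spike_added):
    """Percent recovery of a matrix spike."""
    return (spiked_result - sample_result) / spike_added * 100

dup_rpd = rpd(4.8, 5.3)                           # duplicate pair (mg/L)
spike_r = percent_recovery(14.2, 5.0, 10.0)       # matrix spike (mg/L)
print(f"RPD = {dup_rpd:.1f}%, %R = {spike_r:.0f}%")

if dup_rpd > 20:
    print("Duplicate RPD out of control: investigate precision")
if not 75 <= spike_r <= 125:
    print("Spike recovery out of control: investigate bias")
```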

  31. Surrogate Recovery Example • Control chart: decane recovery (%) vs. QC batch number • A. Apblett, “Novel materials for facile separation of petroleum products from aqueous mixtures via magnetic filtration”

  32. Benefits of Up-front Systematic Planning • Focused data requirements and optimized design for data collection; • Use of clearly developed work plans for collecting data in the field; • A well documented basis for data collection, evaluation, and use; • Clearer statistical analysis of the final data; • Sound, comprehensive QA Project Plans.

  33. Benefits of QA • Clear lines of responsibility • Documented training and analytical competence • Standard procedures to assure data comparability • Catch and correct subtle mistakes/errors

  34. Conclusions • Why go through the hassle & headache? • QA/QC is just good science. • Documented, defensible data. • It is cheaper to do it right the first time. • Your next proposal will be better too!

  35. Website • Virtually all roads lead to: • www.epa.gov/quality

  36. Data Acquisition • Experimental Design • Will the results allow assessment of the hypothesis? • Sampling Methods • Is it representative? • How is it preserved? Transported? • Cross-contamination?

  37. Data Acquisition (cont) • Analytical Measurement Methods • Quality Control • Calibration • Bias & Precision • Blanks, Duplicates, Spikes • Instrument Control
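The calibration step lends itself to a short sketch: fit a linear calibration curve to standards, check linearity, and quantify an unknown. The standard concentrations, responses, and the r² ≥ 0.995 acceptance criterion are illustrative assumptions, though similar linearity criteria appear in many analytical methods.

```python
# Linear calibration: fit standards, check linearity, quantify an unknown.
# Standard concentrations/responses are hypothetical illustration data.
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])          # standards (mg/L)
response = np.array([0.02, 0.51, 0.98, 2.47, 4.95])  # instrument signal

slope, intercept = np.polyfit(conc, response, 1)
r = np.corrcoef(conc, response)[0, 1]
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r^2 = {r**2:.4f}")

# Assumed acceptance criterion before the curve may be used
assert r**2 >= 0.995, "recalibrate: curve fails linearity criterion"

unknown_signal = 1.72
print(f"unknown = {(unknown_signal - intercept) / slope:.2f} mg/L")
```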

  38. Project Management • Organization & Responsibilities • Quality Objectives & Criteria • What do you want to know? (Hypothesis) • What are you measuring, and how ‘good’ does the data need to be? • Record Keeping • Lab, field, and instrument notebooks

  39. QA Plan for Development of Models • Project Description • Model Description - Conceptual Model • Computational Aspects • Data Source/Quality/Input‑Output • Model Validation • Model Application
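Model validation can be illustrated with common goodness-of-fit metrics comparing predictions to observations; the observed and predicted values below are hypothetical illustration data.

```python
# Model validation sketch: common goodness-of-fit metrics.
# Observed/predicted values are hypothetical illustration data.
import numpy as np

observed = np.array([12.1, 15.3, 9.8, 20.4, 17.6])
predicted = np.array([11.5, 16.0, 10.9, 19.2, 18.3])

residuals = predicted - observed
rmse = np.sqrt(np.mean(residuals ** 2))   # overall error magnitude
bias = np.mean(residuals)                 # systematic over/under-prediction
nse = 1 - np.sum(residuals ** 2) / np.sum((observed - observed.mean()) ** 2)

print(f"RMSE = {rmse:.2f}, bias = {bias:+.2f}, Nash-Sutcliffe = {nse:.3f}")
```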

  40. Common Mistakes in MDL Determination • Miscalculation • Incorrect standard deviation • Incorrect degrees of freedom • Insufficient replicates (need at least 7) • Spike concentration out of range • Lowest standard too far from the MDL • Using a method-based MDL without verifying its validity for the current matrix
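A minimal sketch of guarding against several of these mistakes in one routine; the 1–10× spike-to-MDL window is stated here as an assumption based on common practice, and the replicate data are hypothetical.

```python
# Guard against common MDL mistakes: too few replicates, wrong degrees
# of freedom, and a spike level out of range relative to the result.
import statistics
from scipy.stats import t

def mdl_check(replicates, spike_conc, alpha=0.01):
    n = len(replicates)
    assert n >= 7, f"insufficient replicates: {n} (need at least 7)"
    s = statistics.stdev(replicates)        # sample std dev, n-1 df
    mdl = t.ppf(1 - alpha, df=n - 1) * s    # correct degrees of freedom
    ratio = spike_conc / mdl
    # Assumed rule of thumb: spike should be roughly 1-10x the MDL
    assert 1 <= ratio <= 10, f"spike is {ratio:.1f}x the MDL: out of range"
    return mdl

print(mdl_check([0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49], spike_conc=0.5))
```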
