
Quality Assurance and Control


Presentation Transcript


  1. Quality Assurance and Control

  2. Objectives • To define and discuss quality control • To discuss the key features of the design of epidemiologic studies • To discuss data control instruments • To discuss staff training issues

  3. Quality Assurance • Steps in Quality Assurance • Specify study hypothesis • Specify general design to test study hypothesis (study protocol) • Choose and prepare specific instruments (develop operation manuals) • Train staff (certify staff) • Using trained staff, pretest and pilot-study data collection • If necessary, modify steps 2 and 3

  4. Key features of study design (Kahn and Sempos, 1989) • Formulation of the main hypothesis • A priori specification of potential confounding variables • Definition of the characteristics of the study population • Definition of the design strategy for internal validity • Definition of the design strategy for reliability and validity • Specification of the study power • Standardization of procedures • Activities during data collection • Data analysis • Reporting of data

  5. Some quantitative measures of validity and reliability • Validity • Sensitivity • Specificity • Predictive value positive • Predictive value negative • Reliability • Youden’s J statistic • Kappa scores
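The validity measures listed above can all be computed from a 2×2 table of test result against true disease status. A minimal Python sketch, where the `validity_measures` helper and the cell counts are hypothetical, for illustration only:

```python
# Assumed 2x2 layout: tp = true positives, fp = false positives,
# fn = false negatives, tn = true negatives.

def validity_measures(tp, fp, fn, tn):
    """Return the slide's validity measures, plus Youden's J, from 2x2 counts."""
    sensitivity = tp / (tp + fn)   # P(test positive | disease present)
    specificity = tn / (tn + fp)   # P(test negative | disease absent)
    ppv = tp / (tp + fp)           # predictive value positive
    npv = tn / (tn + fn)           # predictive value negative
    youden_j = sensitivity + specificity - 1
    return sensitivity, specificity, ppv, npv, youden_j

# Example with made-up counts:
sens, spec, ppv, npv, j = validity_measures(tp=90, fp=10, fn=10, tn=90)
print(sens, spec, ppv, npv, j)
```

Note that PPV and NPV computed this way reflect the prevalence built into the table's column totals; the slides return to this point below.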

  6. Example of temporal drift in measurement

  7. Phantom measurements

  8. Predictive Values at Different Prevalence Rates with Sensitivity .90 and Specificity .90 • Prevalence 10%: PPV .50, NPV .99 • Prevalence 25%: PPV .76, NPV .96 • Prevalence 50%: PPV .90, NPV .90
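The predictive values on this slide follow from Bayes' theorem applied to sensitivity, specificity, and prevalence. A sketch of that calculation (the `predictive_values` helper is illustrative, not from the slides):

```python
# PPV and NPV as functions of prevalence, by Bayes' theorem:
# PPV = sens*prev / (sens*prev + (1-spec)*(1-prev))
# NPV = spec*(1-prev) / (spec*(1-prev) + (1-sens)*prev)

def predictive_values(sens, spec, prev):
    """Predictive values of a test at a given disease prevalence."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

for prev in (0.10, 0.25, 0.50):
    ppv, npv = predictive_values(0.90, 0.90, prev)
    print(f"Prev {prev:.0%}: PPV {ppv:.2f}, NPV {npv:.2f}")
```

The key point the slide makes: with sensitivity and specificity held fixed, PPV rises and NPV falls as prevalence increases.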

  9. Spectrum of severity

  10. Predictive Values

  11. Kappa Statistic • po = observed probability of concordance between the two surveys • pe = expected probability of concordance between the two surveys by chance • κ = (po − pe) / (1 − pe) • The standard error of the kappa statistic (large-sample approximation) is SE(κ) = √[po(1 − po)] / [(1 − pe)√n] • To test the hypothesis H0: κ = 0 vs. H1: κ > 0, use the test statistic z = κ / SE(κ), referred to the standard normal distribution
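A sketch of the kappa calculation for two surveys classifying the same n subjects, using the common large-sample standard-error approximation SE(κ) ≈ √[po(1 − po)] / [(1 − pe)√n]. The cell counts and the `kappa_stats` helper are hypothetical:

```python
import math

# Assumed 2x2 agreement table between survey 1 and survey 2:
# a = both positive, b = survey 1 only, c = survey 2 only, d = both negative.

def kappa_stats(a, b, c, d):
    """Kappa, its approximate SE, and the z statistic for H0: kappa = 0."""
    n = a + b + c + d
    po = (a + d) / n                            # observed concordance
    p1 = (a + b) / n                            # survey 1 positive rate
    p2 = (a + c) / n                            # survey 2 positive rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)          # chance-expected concordance
    kappa = (po - pe) / (1 - pe)
    se = math.sqrt(po * (1 - po)) / ((1 - pe) * math.sqrt(n))
    z = kappa / se                              # refer to standard normal
    return kappa, se, z

print(kappa_stats(40, 10, 10, 40))
```

Note that po alone is the percent agreement of the next slide; kappa corrects it for the agreement expected by chance.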

  12. Percent agreement

  13. Figure 1. Association of average faculty performance rating (from 1, bottom 20%, to 5, top 20%) and absolute rank on the National Resident Matching Program (NRMP) list (r = 0.19; P =.11).

  14. Table 1. Discrepancy Between the DIS and SCAN for the Lifetime Occurrence of Depressive Disorder in the Baltimore ECA Follow-up*

  15. Table 1. Comparison of WHO and ADA diagnostic categories for undiagnosed diabetes. From: Lee, Diabetes Care, 23(2), February 2000: 181-186
