
Biostatistics and Research Design, #6350


Presentation Transcript


  1. Biostatistics and Research Design, #6350 Screening, Diagnostic Accuracy (sensitivity & specificity)

  2. Thought for the Day: “…The arts and sciences, and a thousand appliances…, but the wind that blows is all that anybody knows” Henry David Thoreau

  3. Learning Concepts: Screening • Understand sensitivity and specificity of a diagnostic test • Be able to calculate sensitivity and specificity • Be able to use the concepts of sensitivity and specificity in clinical decision making NB: from Latin nota bene, meaning “note well”

  4. Why Screen for Disease? • Early detection --> Early treatment • Access into health care system • Not everyone gets routine care • Community service • Practice builder

  5. Why Screen? (early diagnosis) Natural-history timeline: Biologic onset of disease --> Disease detectable by screening test --> Detection by screening test --> Disease detectable by routine methods --> Morbidity (death)

  6. What a screening is / What a screening is not • IS: An indication of a problem: • Cost-effective • Rapid • IS NOT: Completely diagnostic: • Over-referrals and under-referrals • Not 100% accurate • Not a substitute for regular health care

  7. Special Notes: For a screening to be effective: • Need a system in place to handle referrals • The condition being screened for must be treatable

  8. Screening Programs at SCCO: School Screening • CA state law since 1947 • 1st, 3rd, and 6th grades • Since 1971: minimum intervals for conducting a screening • Only legally mandated tests: • Snellen visual acuity • Color vision testing for boys

  9. Screening Programs at SCCO: Special Events • Save Your Vision Week • Back to School Open House • Community screening programs: • Regular school screenings • Senior centers (IOP) • Special Olympics

  10. The Orinda Study (overview; more later) • Screened children in grades 1 - 8 • Ages 5 - 13 • Total of 1,163 children screened • Goals: To design the least expensive, least technical and most effective screening program

  11. The Orinda Study Modified Clinical Technique (MCT) consists of: • visual acuity • retinoscopy • cover test • color vision • ophthalmoscopy

  12. The Orinda Study • Results: The Modified Clinical Technique (MCT) is effective in identifying more than 90% of those with vision problems • Test Positive, Disease Positive; sensitivity = 90%

  13. The Orinda Study • What about inclusion of other tests? • Why or why not: • visual field • tonometry • subjective refraction • blood pressure

  14. Efficacy of Diagnostic Tests or Methods: General • As clinicians (or researchers), we need to use tests that detect the disease or condition well, while properly classifying those without the condition • NB: can’t use the test under consideration to assign as affected or normal • Three concepts: • Sensitivity • Specificity • Receiver Operating Characteristic (ROC)

  15. Sensitivity • Accuracy of the screening procedure to correctly identify all individuals in a population who have a particular disorder • NB: Newer terminology = Detection Rate


  17. Sensitivity • Out of all of the people who have the disorder, how many does your screening test correctly identify? • True positives • Mnemonic? • “Test positive, disease positive” • “Sensitive to disease”

  18. Basic Setup for a 2 × 2 Contingency Table* (sensitivity and specificity) * Also called a “confusion matrix” (Wikipedia, 2010 onward)
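For concreteness, here is a minimal Python sketch of the 2 × 2 computation. The cell counts are hypothetical, chosen only for illustration; they are not from any study in this lecture.

```python
# Minimal sketch: sensitivity and specificity from a 2 x 2 contingency table.
# Cell counts are hypothetical, for illustration only.
tp = 90   # test positive, disease positive (true positives)
fn = 10   # test negative, disease positive (false negatives)
fp = 50   # test positive, disease negative (false positives)
tn = 950  # test negative, disease negative (true negatives)

sensitivity = tp / (tp + fn)  # detection rate: TP / all with the disorder
specificity = tn / (tn + fp)  # TN / all without the disorder

print(f"Sensitivity = {sensitivity:.2f}")  # 0.90
print(f"Specificity = {specificity:.2f}")  # 0.95
```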

  19. Sensitivity: Example • van Bijsterveld OP. Diagnostic Tests in the Sicca Syndrome. Arch Ophthalmol 1969;82:10-14 • Rose bengal staining • 550 normals • 43 dry eye patients • NB: Both eyes included, but this artificially inflates the statistical significance since the eyes are not independent for this condition • Also, limited information as to how the “drys” were classified

  20. Rose Bengal Staining

  21. Sensitivity: Example, Rose Bengal* * Using cut-off value of 3.5 out of 9 possible Sensitivity = 0.95 (82/86)

  22. Specificity • Accuracy of the screening procedure to correctly identify those who do not have the disorder • Mnemonics? • “Test negative, disease negative” • “Specific to health”

  23. Specificity • Out of all of those who do not have the disorder, how many does your screening correctly identify? • True negatives • Implication: • if specificity = 0.90, 10% of normals will be referred for care • if specificity = 0.94, 6% of normals will be referred for care, etc.

  24. Specificity: Example, Rose Bengal* * Using cut-off value of 3.5 out of 9 possible Specificity = 0.96 (1060/1100)
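The two Rose Bengal examples above imply the full per-eye 2 × 2 table (82 true positives among 86 dry eyes; 1060 true negatives among 1100 normal eyes), which can be checked directly:

```python
# Rose Bengal example (cut-off 3.5 of 9), per-eye counts from the slides:
# 86 dry eyes (43 patients x 2 eyes) and 1100 normal eyes (550 x 2).
tp, fn = 82, 86 - 82         # dry eyes testing positive / negative
tn, fp = 1060, 1100 - 1060   # normal eyes testing negative / positive

print(f"Sensitivity = {tp / (tp + fn):.2f}")  # 82/86     = 0.95
print(f"Specificity = {tn / (tn + fp):.2f}")  # 1060/1100 = 0.96
```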

  25. Sensitivity and Specificity • Generally inversely related • Cannot usually have 100% for both (but you CAN maximize both, as we have just observed)
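A short simulation makes the trade-off visible. The score distributions below are invented for illustration (they are not the Rose Bengal data); the point is only that raising the cut-off buys specificity at the cost of sensitivity.

```python
# Sketch: sensitivity/specificity trade-off as the cut-off moves.
# Simulated score distributions, not study data.
import random

random.seed(1)
diseased = [random.gauss(6.0, 1.5) for _ in range(500)]  # higher scores
normals  = [random.gauss(2.0, 1.5) for _ in range(500)]  # lower scores

for cutoff in (1.5, 2.5, 3.5, 4.5, 5.5):
    sens = sum(s >= cutoff for s in diseased) / len(diseased)
    spec = sum(s < cutoff for s in normals) / len(normals)
    print(f"cut-off {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```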

  26. Comparison of Sensitive and Specific Tests • Sensitive tests: few false-negatives; for serious but treatable conditions; to test people without complaints • Specific tests: few false-positives; for conditions with serious misdiagnosis consequences; to confirm a suspected diagnosis

  27. Example*: Test Needs to be Specific and Sensitive • HIV/AIDS: screening test = detect antibodies to virus (ELISA assay) • Sensitivity: 72/74 who were HIV positive (97% sensitivity) • Inappropriate reassurance to an infected person: • Delays treatment, increases spread of virus? • Specificity: 257/261 healthy persons (98% specificity) • News of infection could be devastating to a healthy individual *Greenberg RS, et al. Medical Epidemiology, 3rd Ed., pp. 7-8, © 2001, McGraw-Hill, New York.

  28. ROC Curves • Background: developed during WW II for radar: how to best detect enemy aircraft • Plot: true-positive rate (sensitivity) vs. false-positive rate (1 – specificity) • Can use differing tests or combinations of several tests to provide the largest AUC (area under the curve) • Close to 1.00 is best • Bottom Line: Another test metric
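The ROC idea can be sketched in a few lines: sweep the cut-off, record (false-positive rate, true-positive rate) pairs, and take the area under the resulting curve with the trapezoidal rule. The data below are simulated for illustration, not from any study in this lecture.

```python
# Sketch: ROC points and trapezoidal AUC from raw scores (simulated data).
import random

random.seed(2)
scores = [(random.gauss(6, 1.5), 1) for _ in range(300)] + \
         [(random.gauss(2, 1.5), 0) for _ in range(300)]  # (score, disease)

n_pos = sum(y for _, y in scores)
n_neg = len(scores) - n_pos

points = [(0.0, 0.0), (1.0, 1.0)]  # anchor the curve's endpoints
for cutoff, _ in scores:           # sweep the cut-off over observed scores
    tpr = sum(s >= cutoff and y == 1 for s, y in scores) / n_pos  # sensitivity
    fpr = sum(s >= cutoff and y == 0 for s, y in scores) / n_neg  # 1 - specificity
    points.append((fpr, tpr))

points.sort()
auc = sum((x2 - x1) * (y1 + y2) / 2  # trapezoidal rule
          for (x1, y1), (x2, y2) in zip(points, points[1:]))
print(f"AUC = {auc:.2f}")  # close to 1.00 is best
```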

  29. ROC Curves: Example: Tear Film Thickness to Dx Dry Eye Maximum sensitivity for DE (0.86) and specificity (0.94) if tear film thickness < 2.75 micrometers

  30. Diagnostic Tests for MGD: Paugh’s Pearls

  31. Guidelines for Selecting a Diagnostic Test • Has there been an independent masked comparison with a gold standard of diagnosis? (e.g., an autorefractor vs. subjective refraction) • Has the diagnostic test been evaluated in a patient sample that included an appropriate spectrum of mild and severe, treated and untreated disease, plus individuals with different but similar disorders?

  32. Concepts in Action: The Vision in Preschoolers (VIP) Study • Preschool screening is a major policy issue at the state and national level • Screenings mandated in most states, some even comprehensive eye exams for kids (e.g., Kentucky) • Goal: Determine the best methods to screen for major eye conditions by nurses and lay personnel

  33. VIP: Details, Phase I* • Phase I: ODs and OMDs screened Head Start children vs. comprehensive eye exam (over-represent vision problems) • 4 major conditions: amblyopia, strabismus, significant refractive error, and unexplained VA loss • Goal: compare 11 screening tests vs. exam: which are most sensitive? • Strategy: set specificity at 90% (10% over-referral), what is sensitivity of screeners? * VIP Study Group, Ophthalmol 2004;111:637-650
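The Phase I strategy (fix specificity at 90%, then ask what sensitivity each screener achieves) can be sketched as follows. The sample sizes mirror the Phase II counts quoted later, but the scores themselves are simulated, not VIP data.

```python
# Sketch of the VIP strategy: choose the cut-off at which 90% of normals
# test negative (10% over-referral), then report the resulting sensitivity.
# Simulated scores; sizes mirror VIP Phase II (990 normal, 462 affected).
import random

random.seed(3)
normals  = [random.gauss(2.0, 1.5) for _ in range(990)]
affected = [random.gauss(5.0, 2.0) for _ in range(462)]

# Cut-off = 90th percentile of the normal scores.
cutoff = sorted(normals)[int(0.90 * len(normals))]

specificity = sum(s < cutoff for s in normals) / len(normals)
sensitivity = sum(s >= cutoff for s in affected) / len(affected)
print(f"cut-off {cutoff:.2f}: specificity {specificity:.2f}, "
      f"sensitivity {sensitivity:.2f}")
```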

  34. VIP (LEPs = licensed eye professionals): Phase I Results • Best overall sensitivity: • Ref Error: Non-cycloplegic retinoscopy = 63% • Ref Error: SureSight screener = 63% • Ref Error: Retinomax screener = 63% • VA: Lea symbols test = 61% • Sensitivity of most important to detect: • refractive error: severe anisometropia, hyperopia > 5 D, astigmatism > 2.5 D, myopia > 6 D: 80-90%

  35. VIP Phase II: Nurses vs. Lay People* • n = 1452 total Head Start preschoolers: • Age: ≥ 3 to < 5 yrs • n = 990 normals • n = 462 with vision conditions • All preschoolers had gold std. exams • Used best automated refractors from Phase I (Retinomax, SureSight) plus Lea Symbols (VA) and Stereo Smile II (stereo acuity) • Specificity set at 0.90 (10% over-referral) * VIP Study Group, IOVS 2005;46:2639-2648

  36. VIP Phase II: Results • Overall, nurse screeners had slightly higher sensitivities, but not statistically significant • Also: both groups similar to licensed docs • E.g., for Group I (most severe conditions): • Autorefractors: • Nurses: sensitivity = 0.83 - 0.88 • Lay: sensitivity = 0.82 - 0.85 • Stereo Smile II: • Nurses: sensitivity = 0.58 • Lay: sensitivity = 0.56

  37. “Whether a test should be used or not…”
