
Sources of error in etched track measurements




Presentation Transcript


  1. Sources of error in etched track measurements – or – why aren’t my results as good as yours when we’re using the same detectors? Fero Ibrahimi

  2. Accuracy & Precision: Passive radon detectors • Assessing Accuracy: Reference value / Calibration – Radon Chamber / Box; Chamber / Box instrumentation – itself calibrated / intercompared! • Assessing Precision: Standard error of the mean (SEM) (s / √n); Standard deviation (SD) • Assessing Both: Internal control – blind test / dummy customer; External control – Certification Assurance / Proficiency Test / Validation Scheme; External control – Intercomparison Exercises
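
To make the precision metrics concrete, here is a minimal sketch (with made-up readings) of computing the standard deviation and the standard error of the mean for a group of detectors given the same chamber exposure:

```python
# Minimal sketch with illustrative numbers: precision of ten detectors
# exposed together in a radon chamber.
import statistics

readings = [412, 398, 405, 421, 389, 407, 415, 400, 396, 410]  # kBq m-3 h, made-up values

n = len(readings)
mean = statistics.mean(readings)
sd = statistics.stdev(readings)      # sample standard deviation, s
sem = sd / n ** 0.5                  # standard error of the mean, s / sqrt(n)

print(f"mean = {mean:.1f}, SD = {sd:.1f} ({100 * sd / mean:.1f}%), "
      f"SEM = {sem:.1f} ({100 * sem / mean:.1f}%)")
```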

  3. Intercomparisons: NRPB / HPA Annual Exercise

  4. Intercomparisons: NRPB / HPA Annual Exercise • Since 1997 • 40 passive detectors • 10 x transits – subtracted from exposures • 10 x ‘low’ radon exposure ~ 0.1 – 0.2 MBq m⁻³ h • 10 x ‘UK action level’ exposure ~ 0.2 – 1.0 MBq m⁻³ h • 10 x ‘high’ exposure ~ 1.0 – 2.0 MBq m⁻³ h • 3 different radon exposures & equilibrium factors (F) • For each exposure set: (Net) Absolute % Difference; % Standard Deviation • For all 3 exposure sets: Mean % Difference; Mean % Standard Deviation; Sum; Rank results; Grade ‘A’ < 10%
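
A hedged sketch of the scoring arithmetic described above; the exposure values are invented and the exact HPA grading rules may differ in detail:

```python
# For each exposure set, compute the net absolute % difference of the group
# mean from the reference exposure and the % standard deviation, then
# average both over the three sets. Exposures are illustrative, in MBq m-3 h.
import statistics

def score_set(reported, reference):
    mean = statistics.mean(reported)
    pct_diff = abs(mean - reference) / reference * 100
    pct_sd = statistics.stdev(reported) / mean * 100
    return pct_diff, pct_sd

exposure_sets = {                 # ten net results (transits subtracted) per level
    "low":    ([0.14, 0.15, 0.13, 0.16, 0.15, 0.14, 0.15, 0.13, 0.16, 0.14], 0.15),
    "action": ([0.52, 0.55, 0.50, 0.53, 0.56, 0.51, 0.54, 0.52, 0.55, 0.53], 0.53),
    "high":   ([1.45, 1.50, 1.42, 1.48, 1.52, 1.44, 1.49, 1.46, 1.51, 1.47], 1.48),
}

diffs, sds = zip(*(score_set(r, ref) for r, ref in exposure_sets.values()))
mean_diff = sum(diffs) / 3
mean_sd = sum(sds) / 3
print(f"mean %diff = {mean_diff:.1f}, mean %SD = {mean_sd:.1f}, "
      f"sum = {mean_diff + mean_sd:.1f}")  # sum used for ranking; slide quotes Grade 'A' < 10%
```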

  5. How accurate can I hope to be? NRPB / HPA intercomparisons

  6. How precise can I hope to be? NRPB / HPA intercomparisons

  7. Sources of Measurement Uncertainty • Radon calibration reference value: Radon-222 source ± 3.1% at 2 sigma (95%) Confidence Level – PTB; HPA Radon Chamber – minimum ± 5.3% • Laboratory • Etching equipment • Counting system – Track Recognition, Focus, Scratches • Track overlap – calibration curve correction • Ageing / Fading effects • Seasonal / temperature corrections (Miles, 2001)
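
One common way to roll such components into an overall figure is to combine independent relative uncertainties in quadrature; the sketch below does that using the two values quoted above plus placeholder laboratory terms (the placeholders and the choice of quadrature are assumptions for illustration, not the HPA budget):

```python
# Combine independent relative uncertainty components in quadrature.
# The first two values are from the slide; the laboratory terms are
# placeholders for illustration only.
from math import sqrt

components_2sigma_pct = {
    "Rn-222 source (PTB)": 3.1,   # 2 sigma (95%) confidence level
    "HPA radon chamber":   5.3,   # quoted as a minimum
    "etching + counting":  4.0,   # placeholder
    "ageing / fading":     3.0,   # placeholder
}

combined = sqrt(sum(u ** 2 for u in components_2sigma_pct.values()))
print(f"combined relative uncertainty ≈ {combined:.1f}% at 2 sigma")
```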

  8. Other sources of error • Passive detectors: Diffusion cups / casings – Rn-220 (Tn); Etched track material / polymer: chemicals – monomer, initiator, plasticiser; cure cycle variation • Etching chemicals • Laboratory • Personnel

  9. Track overlap: Calibration curve corrections • Counting whole etched tracks • Counting foreground pixels (px)
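
As a rough illustration of a calibration-curve correction for track overlap (the data points and the quadratic form below are invented, not HPA's actual curve):

```python
# Track overlap makes the observed track count fall below proportionality at
# high exposures, so a calibration curve fitted to chamber exposures of known
# magnitude is used to recover the exposure. Illustrative values only.
import numpy as np

known_exposure  = np.array([0.1, 0.3, 0.6, 1.0, 1.5, 2.0])     # MBq m-3 h
observed_tracks = np.array([150, 440, 850, 1350, 1880, 2300])  # tracks per counted field

# Fit exposure as a quadratic in observed track count.
calibration = np.poly1d(np.polyfit(observed_tracks, known_exposure, deg=2))

measured = 1600  # observed count for a field detector
print(f"overlap-corrected exposure ≈ {calibration(measured):.2f} MBq m-3 h")
```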

  10. Ageing / Fading Effects: Hardcastle & Miles (1996) • Combined ageing & fading correction factor = 0.0007M² + 0.0142M + 0.9528
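
A worked evaluation of the correction factor quoted above; the interpretation of M as an age in months is assumed here, so check Hardcastle & Miles (1996) for its exact definition:

```python
# Combined ageing & fading correction factor from the formula above.
def ageing_fading_factor(m: float) -> float:
    return 0.0007 * m ** 2 + 0.0142 * m + 0.9528

for m in (0, 3, 6, 12):
    print(f"M = {m:2d}  ->  factor = {ageing_fading_factor(m):.3f}")
```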

  11. HPA Ageing & Fading Correction Factors

  12. Why aren’t your results as good as somebody else’s when you’re using the same detectors? • What Quality Assurance checks are you doing? • Personnel – adequate + continued training / support • Radon Chamber / Box instrumentation calibration / intercomparison • Detector calibration – material sensitivity & background – HPA each sheet • Etch System – HPA every time • Count System – HPA every time • Track overlap – calibration linearity • Ageing & Fading effects • Seasonal / temperature effects on annual average concentration
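
To show where these calibration and correction quantities enter a result, here is a minimal end-to-end sketch; every number is an illustrative placeholder and must come from your own calibration and QA programme:

```python
# From counted tracks to an annual-average radon concentration (sketch only).
gross_tracks_mm2      = 12.0     # counted track density on the detector
background_tracks_mm2 = 0.8      # sheet background from unexposed detectors
sensitivity           = 0.025    # tracks mm-2 per kBq m-3 h, from chamber calibration
exposure_hours        = 91 * 24  # nominal three-month deployment
ageing_fading         = 1.05     # combined ageing & fading correction (placeholder)
seasonal_correction   = 0.95     # seasonal factor for the annual average (placeholder)

net = gross_tracks_mm2 - background_tracks_mm2
exposure_kbq_m3_h = (net / sensitivity) * ageing_fading
period_mean_bq_m3 = exposure_kbq_m3_h * 1000 / exposure_hours
annual_average    = period_mean_bq_m3 * seasonal_correction

print(f"period mean ≈ {period_mean_bq_m3:.0f} Bq m-3, "
      f"annual average ≈ {annual_average:.0f} Bq m-3")
```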

  13. Why aren’t your results as good as somebody else’s when you’re using the same detectors? (continued) • How often should you assess your measurement system? • Minimum: internal blind test – every 6 months? • Better: internal blind test – every batch of etched track material • Even better: external proficiency / intercomparison test – 3 different exposures • Best: all of the above! • Any Questions / Comments?

  14. Passive detectors in NRPB / HPA Intercomparisons
