
NIST’s Support of Rad/Nuc Standards Development



1. NIST’s Support of Rad/Nuc Standards Development
Dr. Leticia Pibida, Physicist
National Institute of Standards and Technology
leticia.pibida@nist.gov

2. Introduction
• NIST worked with the Department of Homeland Security (DHS) Science and Technology (S&T) Directorate on standards in:
 • Development of consensus standards
 • Development of test and evaluation protocols for testing laboratories
 • Reporting of test results from testing against ANSI standards
 • Validation of ANSI standards
 • System integration efforts
• NIST works with the DHS Domestic Nuclear Detection Office (DNDO) on:
 • Development of government standards
 • Development of test designs for operational test campaigns
 • Analysis of data from test-campaign results
 • Development of an “Equipment List” for radiation detector users

3. Standards Reviewed (http://standards.ieee.org/getN42/index.html)
• ANSI N42.32 (Chair: Joe McDonald)
 American National Standard Performance Criteria for Alarming Personal Radiation Detectors for Homeland Security
• ANSI N42.33 (Chair: Morgan Cox)
 American National Standard for Portable Radiation Detection Instrumentation for Homeland Security
• ANSI N42.34 (Chair: Peter Chiaro)
 American National Standard Performance Criteria for Hand-held Instruments for the Detection and Identification of Radionuclides
• ANSI N42.35 (Chair: Leticia Pibida / Brian Rees)
 American National Standard for Evaluation and Performance of Radiation Detection Portal Monitors for Use in Homeland Security
• Revisions published January 2007
 • Include lessons learned from equipment testing
 • Development of data sheets for data acquisition at test labs

4. Recently Published Standards
• ANSI N42.37 (Chair: Morgan Cox / Alex Boerner; April 2007)
 Standard for Training Homeland Security Emergency Responders in the Uses and Maintenance of Radiation Detection Instruments
• ANSI N42.38 (Chair: Peter Chiaro; January 2007)
 Standard for Spectroscopy-Based Portal Monitors Used for Homeland Security
• ANSI N42.42 (Chair: George Lasche / Leticia Pibida; March 2007)
 Data format standard for radiation detectors used for Homeland Security
• ANSI N42.43 (Chair: Peter Chiaro; April 2007)
 Standard for Mobile and Transportable Systems Including Cranes Used for Homeland Security Applications

5. Standards Under Development
• ANSI N42.39 (Chair: Alan Thompson / Joe McDonald)
 Standard for Performance Criteria for Neutron Detectors for Homeland Security
• ANSI N42.41 (Chair: David Gilliam)
 Performance Criteria for Active Interrogation Systems Used for Homeland Security
• ANSI N42.48 (Chair: Peter Chiaro)
 American National Standard Performance Requirements for Spectroscopic Personal Radiation Detectors (SPRDs) for Homeland Security
• ANSI N42.49 (Chair: Morgan Cox / Joe McDonald)
 American National Standard for Performance Criteria for Personal Emergency Radiation Detectors (PERDs) for Exposure Control

6. Standards Under Development (cont.)
• ANSI N42.44 (Chair: Mike Barrientos, TSL/SRA)
 Performance and evaluation of checkpoint cabinet x-ray imaging security-screening systems
• ANSI N42.45 (Chair: Lok Koo, TSL/SRA / Jim Connelly, L3 Corporation)
 Evaluating the image quality of x-ray computed tomography security-screening systems
• ANSI N42.46 (Chair: Stacy Wright, TMEC / Jim Lamers, NTMI)
 Measuring the performance of imaging x-ray and gamma-ray systems for cargo and vehicle security screening
• ANSI N42.47 (Chair: Frank Cerra)
 Measuring the imaging performance of x-ray and gamma-ray systems for security screening of humans

7. Proposed Standards
• ANSI N42.XX (Chair: Ed Groeber)
 Standard, or appendix to existing standards, on the use of radiation detection instruments under extreme conditions
• ANSI N42.XX (Chair: TBD)
 Performance criteria for handheld survey meters used in response and recovery for homeland security applications
• ANSI/ASTM (Chair: TBD)
 Standards for sampling and forensics in radiological events

8. International Standards Work
Working group WG B15: Radiation protection instrumentation – Highly sensitive hand-held instruments for detection of radioactive material
• IEC 62327 - Radiation protection instrumentation - Hand-held instruments for the detection and identification of radionuclides and for the indication of ambient dose equivalent rate from photon radiation
• IEC 62484 - Radiation protection instrumentation - Spectroscopy-based portal monitors used for the detection and identification of illicit trafficking of radioactive material
• IEC 62401 - Radiation protection instrumentation - Alarming Personal Radiation Devices (PRDs) for detection of illicit trafficking of radioactive material

9. Equipment Testing Against Standards
[Workflow diagram: DHS sends a task management report to NIST; NIST coordinates testing, makes equipment arrangements, and signs agreements with manufacturers; manufacturers send equipment to the testing labs for testing; the testing labs send test results to NIST for review and report.]

10. ANSI N42 Standards General Tests
• Radiological tests: exposure rate, background, false alarm, gamma and neutron response, radionuclide identification (strongly depend on detector type)
• Environmental tests: temperature, humidity, sealing (similar for all types of detectors)
• Mechanical tests: mechanical shocks, vibration, drop test (strongly depend on detector type)
• Electromagnetic tests: external magnetic fields, radio frequency, conducted disturbances (burst and radio frequencies), surges and oscillatory waves, electrostatic discharges (similar for all types of detectors)

11. Equipment Testing Round 1
Labs involved: NIST, ORNL, PNNL, LLNL, and LANL
• Number of companies = 28
• Number of PRDs (pagers) = 18
• Number of survey meters = 23 (33 with probes)
• Number of radionuclide identifiers = 7
• Number of portals = 14
• Total number of instruments to test = 172 (202 with probes)
Test report published on the Responder Knowledge Base (RKB): www.rkb.mipt.org

12. Implications of Round 1 Results and Report
For DHS:
• This information could assist DHS in making informed procurement choices and making more effective use of grant money
• Improve ANSI/IEEE standards’ requirements
For users:
• Compare test results for deployment of appropriate equipment
For manufacturers:
• Improve instrument performance to meet DHS needs

13. Equipment Testing Round 2
Labs involved: NIST, ORNL, and PNNL
• Number of companies = 11
• Number of PRDs (pagers) = 16
• Number of survey meters = 9
• Number of radionuclide identifiers = 8
• Number of portals = 3
• Total number of instruments to test = 105
Round 2 deadlines:
• April 25: equipment submission
• April 29: update of T&E protocols and data sheets
• May 2: start equipment testing
• April 2006: complete equipment testing

14. Format of Report
• Summary table (all instruments): Pass / Fail / Conditional
• Table of requirements: ANSI N42.3x + testing and evaluation protocol
 • General
 • Radiological
 • Environmental
 • Electrical and electromagnetic
 • Mechanical
• Detailed tables for each instrument
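The roll-up from per-category requirements to the summary verdict can be sketched in code. This is a minimal illustration, not the actual NIST report tooling; the instrument names and results below are hypothetical, and the roll-up rule (any failure fails the instrument, any conditional result makes it conditional) is an assumption consistent with the Pass / Fail / Conditional table described above.

```python
# Sketch: roll per-category test results up into the overall report verdict.
# Instrument names and results are hypothetical, for illustration only.

REQUIREMENT_CATEGORIES = [
    "General", "Radiological", "Environmental",
    "Electrical and Electromagnetic", "Mechanical",
]

def summarize(results: dict) -> str:
    """Map per-category results ("pass"/"fail"/"conditional") to a verdict.

    Assumed rule: any failed category fails the instrument; any conditional
    category (with no failures) makes the verdict Conditional; otherwise Pass.
    """
    values = [results[cat] for cat in REQUIREMENT_CATEGORIES]
    if "fail" in values:
        return "Fail"
    if "conditional" in values:
        return "Conditional"
    return "Pass"

# Hypothetical instruments for the summary table
instruments = {
    "RIID-A": dict.fromkeys(REQUIREMENT_CATEGORIES, "pass"),
    "RIID-B": {**dict.fromkeys(REQUIREMENT_CATEGORIES, "pass"),
               "Mechanical": "conditional"},
}

for name, results in instruments.items():
    print(f"{name}: {summarize(results)}")
```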

15.–17. Example: Test Results, RIIDs Round 2
[Detailed results tables shown on slides; not reproduced in the transcript.]

18. Equipment List for Radiation Detectors
• NIST is working with DNDO to develop a system for testing commercially available equipment
• Equipment that passes the required tests will be listed in an “Equipment List” to assist users during purchasing and deployment
• Equipment testing will be split into 3 tiers:
 • Tier 1: testing against consensus standards (ANSI/IEEE standards)
 • Tier 2: testing against government standards (to be developed by DNDO)
 • Tier 3: scenario testing (performed by DNDO with user input)
• Tier 1 testing will be carried out by NVLAP-accredited laboratories
 • NVLAP handbook (final) available at: http://ts.nist.gov/ts/htdocs/210/214/214.htm
 • For information on how to apply: http://www.nist.gov/nvlap
• Tier 2 and Tier 3 testing could take place at assigned DNDO laboratories
• Post-market surveillance:
 • Equipment testing
 • Supplier’s declaration of conformity

19. Operational/Scenario Testing Efforts
NIST is working with DNDO and users on the development of test designs and data analysis for operational/scenario test campaigns.
Completed test campaigns:
• Anole: included testing of RIIDs and backpacks. Results available at www.rkb.mipt.org
• Bobcat: included testing of Personal Radiation Detectors (pager type). Results will be available soon
Future test campaigns:
• Crawdad: will include testing of equipment used in maritime applications

20. Additional Testing at NIST
• Validation of ANSI standards’ radiological requirements for N42.32, N42.33, N42.34, N42.35, N42.38, and N42.43 (for backpacks)
• Will start validation of the N42.49 standard (Personal Radiation Dosimeters)
• Testing of portal monitors at the NIST C-Gate
• Development of new software for optimization of source detection for gross-count systems
• Radiation detection sensor integration with robots
 • DHS/NIST-sponsored evaluation exercise at the Maryland Task Force 1 Training Facility

21. Robot/Sensor Testing: Lessons Learned and Needs
• For sensor plug-and-play capability on robots, need to:
 - Define the wire connection in the robot
 - Define the communication protocol between sensor and robot (2-way communication)
 - Transmit ANSI N42.42 data format (XML) files to save spectra and data
 - Define the display in the robot control unit
• Cameras are not very useful, as sensor screens were not readable in sunlight
• Need audible capability: rad sensors need to meet 85 dB at 30 cm
• Sensor mounting capabilities will depend on mission type
• High-sensitivity instruments need gross-counting and/or ID capabilities
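ANSI N42.42 specifies an XML format for exchanging spectra and instrument data, which is what makes robot-to-sensor data transfer interoperable. As a rough illustration of writing and reading such a file, here is a deliberately simplified sketch: the element names (`N42InstrumentData`, `Measurement`, `Spectrum`, `ChannelData`, etc.) are illustrative assumptions, not the normative N42.42 schema, which defines its own elements, attributes, and namespace.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a simplified N42-style spectrum document.
# The real ANSI N42.42 schema is normative; element names here are assumed.
def build_spectrum_xml(channel_counts, live_time_s, real_time_s):
    root = ET.Element("N42InstrumentData")           # assumed root element
    meas = ET.SubElement(root, "Measurement")
    spec = ET.SubElement(meas, "Spectrum")
    ET.SubElement(spec, "LiveTime").text = str(live_time_s)
    ET.SubElement(spec, "RealTime").text = str(real_time_s)
    # Channel data stored as a space-separated list of counts
    ET.SubElement(spec, "ChannelData").text = " ".join(map(str, channel_counts))
    return ET.tostring(root, encoding="unicode")

def parse_channel_data(xml_text):
    """Recover the channel counts from the XML text."""
    root = ET.fromstring(xml_text)
    data = root.findtext(".//ChannelData")
    return [int(c) for c in data.split()]

xml_text = build_spectrum_xml([0, 3, 12, 7, 1], live_time_s=300, real_time_s=302)
print(parse_channel_data(xml_text))  # round-trips the channel counts
```

Because the payload is plain XML, a robot control unit only needs an XML parser and agreement on the schema to display or log spectra from any conforming sensor, which is the point of the N42.42 requirement above.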

22. Summary of N42.43 Backpack Testing
• The sensitivity of present backpacks does not meet the ANSI standard requirements for all tested parameters
• Need to review ANSI standard requirements
• Need to interact with manufacturers to ensure optimal system performance
• Suggestions for standard revision:
 • For the radionuclides specified in the standard, the testing distance should be closer than 3 m
 • Require only 180° tests, to reduce the angular-dependence testing time
 • Neutron moderator thickness could be reduced (after a complete study)
 • Phantom type (water, PMMA) and size might be specified in the standard
 • Alarm thresholds for the backpacks should be investigated so as to meet the false-alarm-rate requirement; users should be aware that instrument performance changes at different alarm thresholds

23. Source Development and Calibration
• Supply and calibration of gamma-ray and neutron sources to DOE labs, for use in equipment testing against ANSI N42.35 and N42.38
• Developed new 232Th (14 Ci) and 226Ra (8 Ci) sources for use in testing against N42.38
• Helped source manufacturers with the design, calibration, and development of new sources; commercialization by AEA Technology (QSA Global) provides private-sector participation
• Sources can be made available to users for instrument checking

24. Source Activities (ANSI/IEEE N42.35); Source Activities (ANSI/IEEE N42.38)
[Source-activity tables shown on slides; not reproduced in the transcript.]

25. NCRP Work
NIST was part of the National Council on Radiation Protection and Measurements (NCRP) working group on the development of NCRP Commentary No. 19:
• Key Elements of Preparing Emergency Responders for Nuclear and Radiological Terrorism

26. Present Gaps and Concerns
• Standards for response and recovery efforts
• Implementation of the “Equipment List” program
 • Manufacturers need to provide a test-result summary for each instrument shipped to a user
• Acceptance testing performed by users on purchased equipment before deployment
 • Users need to screen for dead-on-arrival instruments
• Calibration and maintenance of equipment once in the field
 • Users should have a mechanism to ensure that instrument performance is not degraded with use
• Functionality tests for deployed instruments
 • Users should have a routine procedure to check for malfunctioning instruments
• Develop a system to ensure that software upgrades performed by manufacturers do not affect users
 • Users need to be aware of changes

27. Conclusions
• Much work still needs to be done
• User input is critical to address existing needs
• Coordination between different agencies and user communities is key to success
• Dissemination of information to users about existing standards and technology is needed to ensure the deployment of instrumentation appropriate to the mission
• Performance, operational, and field-testing results should be integrated to provide users with all the information necessary to evaluate and compare instruments prior to acquisition
