Phenomenon Involving the Change in Neutron Survey Meter Efficiency Versus Accumulated Dose
R.K. Piper1; M.K. Murphy1; and A.K. Thompson2 (1Battelle-PNNL; 2NIST)
Some individuals in the nuclear industry, including operators of the calibration laboratories at the National Institute of Standards and Technology (NIST) and the Pacific Northwest National Laboratory (PNNL), have observed a phenomenon in some neutron survey instruments in which detector efficiency changes as the integrated dose increases. The effect appears to be limited to proportional counter-based neutron detectors (e.g., BF3 and 3He detectors), and the efficiency increase can reach 15-20% before stabilizing. The amplitude of the effect shows no correlation with detector model and, in fact, can vary widely among units of the same model. The elevated response eventually returns to the baseline condition, but the recovery time also appears to be inconsistent for a given detector, ranging from hours to days. This effect can easily go unnoticed by the calibrator and has the potential to introduce significant errors both in the calibration of the detector itself (i.e., when it is used as a survey instrument) and in the reference fields calibrated using the detector (i.e., when it is used as a transfer standard). Data obtained at both PNNL and NIST, the potential influence on neutron calibrations, and recommended methods for compensating for this effect are presented.
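To illustrate the scale of the calibration error described above, the following Python sketch (not from the paper; all numbers are hypothetical, with the 15% figure taken as the upper end of the range reported in the abstract) shows how an unnoticed transient efficiency increase during calibration biases the derived calibration factor, and hence later survey readings.

```python
# Illustrative sketch: how an unnoticed transient efficiency increase
# biases a derived calibration factor. All values are hypothetical.

def calibration_factor(reference_dose_rate: float, observed_reading: float) -> float:
    """Calibration factor N = reference dose rate / instrument reading."""
    return reference_dose_rate / observed_reading

reference_rate = 1.0      # reference-field dose rate (arbitrary units)
baseline_reading = 1.0    # reading at nominal (baseline) efficiency

# Suppose accumulated dose has transiently raised the detector efficiency
# by 15%, so the instrument over-responds during the calibration.
elevated_reading = baseline_reading * 1.15

n_baseline = calibration_factor(reference_rate, baseline_reading)
n_elevated = calibration_factor(reference_rate, elevated_reading)

# The calibration factor assigned during the elevated-response period is
# ~13% low; once the detector relaxes back to baseline, surveys made with
# that factor will under-report dose by the same fraction.
bias = (n_elevated - n_baseline) / n_baseline
print(f"fractional bias in calibration factor: {bias:+.3f}")
```

The same arithmetic applies in the transfer-standard direction: if the detector's response is elevated while it is used to characterize a reference field, the field strength inferred from its readings will be overestimated by the same fraction.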