Answer to Question #8263 Submitted to "Ask the Experts"

Category: Instrumentation and Measurements — Surveys and Measurements (SM)

The following question was answered by an expert in the appropriate field:


I have a Victoreen 451P pressurized ion chamber survey meter. I also have a BOT Engineering Company Ltd. energy-compensated Geiger-Mueller (GM) detector. When I take the dose rate reading from a 131I package (with 5.55 x 10Bq inside), the dose rate at 1 meter is similar on both instruments. However, the surface readings are quite different: the GM reading is double that of the ion chamber at the surface. I understand that the GM will overestimate the dose from low-energy gammas. Is 131I, at 364 keV, truly a "low" energy gamma? What is going on here?


Assuming that both detectors were properly calibrated, and that the GM tube was, indeed, energy-compensated, I do not believe the difference you describe is associated with any energy dependence. What you describe is, I believe, a rather classic case of detector geometry affecting the reading obtained, and the problem has some rather disturbing implications in certain practical situations, not the least of which is in monitoring of packages to ensure compliance with transportation regulations.

When a detector is calibrated to measure gamma exposure rate or dose rate, the radiation field is set up so as to be uniform over the dimensions of the detector. This condition is relatively easy to achieve if the detector is far enough away from the source that the change in dose rate with distance is small. As the detector is moved close to the source, especially if the source dimensions are relatively small, the gamma field over the volume of the detector becomes less uniform, and the reading on the detector reflects the average response throughout the volume, not the actual dose rate at a fixed point. In general, the larger the detector dimensions, the less uniform the field over the detector volume will be and the greater the disparity between the reading and the actual dose rate at a point, such as a point at the center of the detector volume or at the surface of a package, as in this case.
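To see the size of this volume-averaging effect, here is a minimal numerical sketch. It assumes a point source and models the detector as a one-dimensional interval of sensitive depth along the source-detector axis, so the "reading" is the average of the inverse-square dose rate over that interval (the closed form follows because the integral of 1/r^2 is -1/r). The 1 cm source standoff and 7 cm chamber depth are illustrative assumptions, not measured values.

```python
def avg_reading(r_front_cm, depth_cm, k=1.0):
    """Average of k/r^2 over [r_front, r_front + depth].
    Closed form: (1/(r_back - r_front)) * (1/r_front - 1/r_back) * k
    simplifies to k / (r_front * r_back)."""
    r_back = r_front_cm + depth_cm
    return k / (r_front_cm * r_back)

def point_dose_rate(r_cm, k=1.0):
    """True inverse-square dose rate at a single point."""
    return k / r_cm**2

# Source assumed 1 cm behind the surface; a 7-cm-deep chamber pressed
# against the surface averages the field from 1 cm to 8 cm:
contact_ratio = avg_reading(1.0, 7.0) / point_dose_rate(1.0)
print(contact_ratio)   # 0.125 -- the reading is only 1/8 of the point value

# At 1 m the field is nearly uniform across the same chamber:
far_ratio = avg_reading(100.0, 7.0) / point_dose_rate(100.0)
print(far_ratio)       # ~0.93 -- reading and point value nearly agree
```

This is why the two instruments agree at 1 meter yet diverge at contact: close in, each instrument reports an average that depends strongly on its own dimensions.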

The two detector types you describe are probably quite different in geometry. The 451P detector is a relatively large cylindrical chamber of about 250 cm3 volume, and it is contained in a closed case. I don't know which specific BOT GM detector you are using, but many of BOT's energy-compensated tubes have diameters of about 3.5 cm. If such a detector were placed at the surface of a package with its longitudinal axis parallel to the package surface, the center of the detector would be about 1.8 cm from the surface. I do not have the exact dimensions of the 451P chamber but, given its volume, the depth of such a chamber is likely between about 5 cm and 8 cm, and the effective center of the volume would be, at minimum, between 2.5 cm and 4 cm from the package surface. These dimensional differences between the two detectors can explain the fact that the two detectors read differently when placed in contact with the package surface. This can be a source of some concern and consternation, especially when the dose rate near the package surface is greater than the allowed limit. In such an event, one worker releasing a package for shipment may use a detector that yields a reading less than the limit, while a worker surveying the same package upon receipt may use a detector of smaller dimensions and obtain a reading above the allowed limit. When the detectors are 1 meter from the package surface, the readings are about the same because at this distance the radiation field over the detector volumes is quite uniform.
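The dimensions quoted above can be turned into a rough check of the factor of two you observed. This sketch assumes a point source at some depth inside the package, an effective GM center 1.8 cm outside the surface, and an ion chamber effective center of about 3.5 cm, and compares simple inverse-square readings at the two centers; the source depths are illustrative.

```python
def ratio_gm_to_chamber(source_depth_cm, gm_center=1.8, ic_center=3.5):
    """Ratio of GM reading to ion-chamber reading for a point source
    source_depth_cm inside the package, assuming pure 1/r^2 response
    at each detector's effective center."""
    r_gm = source_depth_cm + gm_center
    r_ic = source_depth_cm + ic_center
    return (r_ic / r_gm) ** 2

for depth in (0.0, 2.0, 10.0, 100.0):
    print(depth, round(ratio_gm_to_chamber(depth), 2))
# 0.0   3.78  -> source right at the inner surface
# 2.0   2.1   -> a few cm inside: roughly the factor of 2 observed
# 10.0  1.31
# 100.0 1.03  -> at about 1 m the readings essentially agree
```

A source a few centimeters inside the package reproduces the observed 2:1 disagreement at contact, while at 1 meter the geometric difference between the instruments all but vanishes.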

In conclusion, the smaller the detector dimensions, especially along the normal line connecting the source to the detector wall of radiation incidence, the closer the measured dose rate will be to the actual dose rate at the package surface. As a pertinent aside, it is my own feeling that the regulatory agencies should not specify transportation limits in terms of dose rate at the surface of the package, precisely because such a number cannot be measured reliably with a detector of finite dimensions, and workers using different instruments will likely measure different "surface dose rates." Nor does the surface dose rate likely reflect the real level of exposure to individuals handling a package. Specifying the dose rate at a distance such as 10 cm from the surface (measured to the center of the detector volume) would, I believe, provide more consistency in measurements and eliminate some of the current practical measurement concerns without sacrificing radiation safety. As of now, regulatory agencies have not accepted this view, and I am afraid you will have to deal with the uncertainty, likely accepting the higher reading on the GM detector as closer to the truth. Generally, the only time you might have a problem of legal significance is when a reading is close to an allowed limit. Then the question of whether you are actually measuring the surface dose rate may become relevant. Good luck.
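The benefit of specifying the measurement at 10 cm (to the detector center) can also be sketched numerically. If both instruments place the centers of their sensitive volumes at the same distance, the only residual disagreement comes from volume averaging. Modeling each detector as a one-dimensional interval of length L centered at distance r_c from a point source, the average of 1/r^2 over [r_c - L/2, r_c + L/2] works out to 1/(r_c^2 - L^2/4). The 2 cm source depth and the detector lengths below are assumptions carried over from the discussion above.

```python
def averaging_error(r_center_cm, length_cm):
    """Ratio of volume-averaged reading to the point dose rate at the
    detector center, for a 1-D detector of given length. Average of
    1/r^2 over the interval is 1/(r_c^2 - L^2/4)."""
    rc2 = r_center_cm ** 2
    return rc2 / (rc2 - length_cm ** 2 / 4.0)

# Source assumed 2 cm inside the package, detector centers 10 cm
# outside the surface: r_c = 12 cm for both instruments.
print(averaging_error(12.0, 3.5))  # GM tube: reads ~2% high
print(averaging_error(12.0, 7.0))  # ion chamber: reads ~9% high
```

Under these assumptions, the two instruments would disagree by well under 10 percent at the 10-cm specification, versus the factor of two seen in contact readings.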

George Chabot, PhD, CHP

Answer posted on 23 April 2009. The information posted on this web page is intended as general reference information only. Specific facts and circumstances may affect the applicability of concepts, materials, and information described herein. The information provided is not a substitute for professional advice and should not be relied upon in the absence of such professional advice. To the best of our knowledge, answers are correct at the time they are posted. Be advised that over time, requirements could change, new data could be made available, and Internet links could change, affecting the correctness of the answers. Answers are the professional opinions of the expert responding to each question; they do not necessarily represent the position of the Health Physics Society.