High-resolution gamma spectroscopy has been employed to measure the ¹³⁴Cs/¹³⁷Cs, ¹⁵⁴Eu/¹³⁷Cs and ¹³⁴Cs/¹⁵⁴Eu gamma intensity ratios from spent fuel, with the aim of deriving pin-averaged single-ratio burnup indicators for high and ultra-high burnups. Two UO₂ pressurised water reactor (PWR) fuel rod segments with record burnup levels above 80 GWd/t have been experimentally characterised. In addition, pin-cell depletion calculations have been performed for each sample with the deterministic code CASMO-4, using both its JEF-2.2- and its ENDF/B-IV-based libraries and three different descriptions of the fuel rod irradiation histories, in order to test the sensitivity of the results to the neutron cross sections and to the depletion model employed. Measured and calculated ratios have then been compared. The ¹³⁴Cs/¹³⁷Cs ratio, frequently used as a burnup monitor, is shown to be considerably less accurate for burnups exceeding 50 GWd/t, with discrepancies of up to ~25% between measured and calculated values. The ratios involving the ¹⁵⁴Eu concentration show much larger discrepancies, essentially because this isotope is rather poorly predicted, as revealed by the use of different basic cross-section data.
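The measured-to-calculated comparison described above reduces to a simple relative-discrepancy figure per ratio. A minimal sketch of that arithmetic is given below; the ratio values are hypothetical placeholders, not data from the paper, and are chosen only to illustrate a ~25% discrepancy of the kind reported.

```python
# Illustrative sketch (not from the paper): quantifying the discrepancy
# between a calculated and a measured isotopic-ratio burnup indicator.

def relative_discrepancy(measured: float, calculated: float) -> float:
    """Return the (calculated - measured) / measured discrepancy in percent."""
    return 100.0 * (calculated - measured) / measured

# Hypothetical 134Cs/137Cs intensity ratios for a high-burnup sample;
# the calculated value is assumed to overpredict the measurement.
measured_ratio = 0.080
calculated_ratio = 0.100

print(f"{relative_discrepancy(measured_ratio, calculated_ratio):+.1f}%")
```

With these placeholder values the sketch reports a +25.0% discrepancy, matching the order of magnitude quoted in the abstract for the ¹³⁴Cs/¹³⁷Cs ratio at burnups above 50 GWd/t.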