In this short paper I present the results of a calculation that seeks the maximum, or optimal, signal-to-noise energy band for galaxy group or cluster X-ray emission detected by the Chandra and XMM-Newton observatories. Using a background spectrum derived from observations and a grid of models, I show that the "classical" 0.5-2 keV band is indeed close to optimal for clusters with gas temperatures greater than 2 keV and redshifts z ≲ 1. For cooler systems, however, this band is generally far from optimal: sub-keV plasmas can suffer a 20%-60% signal-to-noise loss compared to an optimal band, and the loss worsens for z > 0. The implication is that current and forthcoming surveys should be carefully designed to minimize bias against the low-mass, low-temperature end of the cluster/group population.
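To make the band-optimization idea concrete, the following is a minimal Python sketch, not the paper's actual calculation. It assumes a toy bremsstrahlung-like source spectrum and a power-law background (both with arbitrary normalizations, names such as source_spectrum and band_snr are hypothetical) and the common S/√(S+B) detection statistic; the paper itself uses an observationally derived background and a grid of plasma models, so any numbers this sketch produces are illustrative only.

```python
import numpy as np

# Observed-frame energy grid (keV) spanning a typical Chandra/XMM-Newton range.
energies = np.linspace(0.2, 10.0, 200)
de = energies[1] - energies[0]

def source_spectrum(E, kT, z):
    """Toy redshifted bremsstrahlung-like photon spectrum (arbitrary norm).

    A stand-in for a real plasma model; genuine sub-keV emission also
    carries strong line features (e.g., the Fe L complex).
    """
    E_rest = E * (1.0 + z)          # observed energy -> rest-frame energy
    return np.exp(-E_rest / kT) / E

def background_spectrum(E):
    """Toy power-law background (arbitrary norm)."""
    return 0.3 * E ** -1.4

def snr(s, b):
    """Signal-to-noise of source counts s over background counts b."""
    return s / np.sqrt(s + b)

def optimal_band(kT, z):
    """Brute-force search over all band edges [E_lo, E_hi] on the grid."""
    S = source_spectrum(energies, kT, z)
    B = background_spectrum(energies)
    # Cumulative integrals let each candidate band's counts be read off in O(1).
    cS = np.concatenate(([0.0], np.cumsum(S) * de))
    cB = np.concatenate(([0.0], np.cumsum(B) * de))
    best, best_band = -np.inf, None
    for i in range(len(energies)):
        for j in range(i + 1, len(energies) + 1):
            val = snr(cS[j] - cS[i], cB[j] - cB[i])
            if val > best:
                best, best_band = val, (energies[i], energies[j - 1])
    return best_band, best

def band_snr(kT, z, e_lo, e_hi):
    """S/N in a fixed band, e.g., the classical 0.5-2 keV band."""
    m = (energies >= e_lo) & (energies <= e_hi)
    s = np.sum(source_spectrum(energies[m], kT, z)) * de
    b = np.sum(background_spectrum(energies[m])) * de
    return snr(s, b)

# A sub-keV group at moderate redshift: compare the optimal band with 0.5-2 keV.
(lo, hi), best = optimal_band(kT=0.7, z=0.3)
fixed = band_snr(kT=0.7, z=0.3, e_lo=0.5, e_hi=2.0)
print(f"optimal band ~ {lo:.2f}-{hi:.2f} keV")
print(f"S/N loss in the 0.5-2 keV band: {100 * (1 - fixed / best):.0f}%")
```

The final comparison mirrors the quantity quoted above (the fractional signal-to-noise loss of the fixed 0.5-2 keV band relative to the optimum for a cool system), though the exact percentage here depends entirely on the toy spectra chosen.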