We show that the observed mass-to-light (M/L) ratio of galaxy clusters increases with cluster temperature, as expected from cosmological simulations. Contrary to previous observational suggestions, we find a mild but robust increase of M/L from poor (T ~ 1-2 keV) to rich (T ~ 12 keV) clusters; over this range, the mean M/L_V increases by a factor of about 2. The best-fit relation is M/L_V = (170 ± 30) T_keV^(0.3 ± 0.1) h at z = 0, with a large scatter. This trend confirms the prediction of cosmological simulations that the richest clusters are antibiased, i.e., they contain more mass per unit light than average, and that the antibias increases with cluster temperature. The effect arises from the relatively older age of these high-density systems: their light has dimmed more than average since their earlier formation time. Combining the current observations with simulations, we find a global value of M/L_V ≈ 240 ± 50 h and a corresponding mass density of the universe of Ω_m = 0.17 ± 0.05.
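For concreteness, the arithmetic behind these numbers can be sketched as follows: the best-fit power law gives M/L_V at any cluster temperature, and the global mass density follows from Ω_m = (M/L_V) / (M/L_V)_crit, where (M/L_V)_crit = ρ_crit / j_V. The minimal Python sketch below assumes a V-band luminosity density j_V ≈ 2 × 10^8 h L_⊙ Mpc^-3; this value is an illustrative assumption, not a quantity quoted in the abstract.

```python
# Sketch of the abstract's arithmetic (assumed illustrative values flagged below).

RHO_CRIT = 2.775e11  # critical density, h^2 M_sun / Mpc^3 (standard value)
J_V = 2.0e8          # V-band luminosity density, h L_sun / Mpc^3 (ASSUMED, illustrative)

def ml_ratio(T_keV, norm=170.0, slope=0.3):
    """Best-fit cluster relation from the abstract:
    M/L_V = (170 +- 30) T_keV^(0.3 +- 0.1) h, in h M_sun / L_sun."""
    return norm * T_keV**slope

def omega_m(ml_global):
    """Omega_m = (M/L_V) / (M/L_V)_crit, with (M/L_V)_crit = rho_crit / j_V."""
    ml_crit = RHO_CRIT / J_V  # ~1390 h M_sun / L_sun for the assumed j_V
    return ml_global / ml_crit

if __name__ == "__main__":
    # M/L_V roughly doubles from T ~ 1 keV to T ~ 12 keV, as stated in the abstract.
    for T in (1.0, 2.0, 12.0):
        print(f"T = {T:4.1f} keV  ->  M/L_V ~ {ml_ratio(T):.0f} h")
    # Global value quoted in the abstract: M/L_V ~ 240 h.
    print(f"Omega_m ~ {omega_m(240.0):.2f}")  # ~0.17, consistent with the abstract
```

With the assumed j_V, (M/L_V)_crit ≈ 1390 h, so M/L_V = 240 h yields Ω_m ≈ 0.17, and the power-law slope of 0.3 reproduces the quoted factor-of-2 rise (12^0.3 ≈ 2.1).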