The interaction of TeV photons from blazars with the extragalactic background light produces a relativistic beam of electron-positron pairs streaming through the intergalactic medium (IGM). The fate of the beam energy is uncertain. By means of two- and three-dimensional particle-in-cell simulations, we study the nonlinear evolution of dilute ultra-relativistic pair beams propagating through the IGM. We explore a wide range of beam Lorentz factors γ_b ≫ 1 and beam-to-plasma density ratios α ≪ 1, so that our results can be extrapolated to the extreme parameters of blazar-induced beams (γ_b ~ 10^6 and α ~ 10^{-15}, for powerful blazars). For cold beams, we show that the oblique instability governs the early stages of evolution, but its exponential growth terminates, due to self-heating of the beam in the transverse direction, when only a negligible fraction ~(α/γ_b)^{1/3} ~ 10^{-7} of the beam energy has been transferred to the IGM plasma. Further relaxation of the beam proceeds through quasi-longitudinal modes, until the momentum dispersion in the direction of propagation saturates at Δp_{b,∥}/(γ_b m_e c) ~ 0.2. This corresponds to a fraction ~10% of the beam energy, irrespective of γ_b or α, being ultimately transferred to the IGM plasma (as compared to the heating efficiency of ~50% predicted by one-dimensional models, which cannot properly account for the transverse broadening of the beam). For the warm beams generated by TeV blazars, the development of the longitudinal relaxation is suppressed, since the initial dispersion in beam momentum is already Δp_{b0,∥}/(γ_b m_e c) ≳ 1. Here, the fraction of beam energy ultimately deposited into the IGM is only ~α γ_b ~ 10^{-9}. It follows that most of the beam energy is still available to power the GeV emission produced by inverse Compton up-scattering of the cosmic microwave background by the beam pairs.
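The two heating-efficiency estimates quoted above follow directly from the fiducial blazar-beam parameters. A minimal numerical check, assuming the fiducial values γ_b ~ 10^6 and α ~ 10^{-15} given in the text:

```python
# Sketch: evaluate the beam-energy fractions quoted in the abstract
# for the fiducial blazar-beam parameters stated in the text.
gamma_b = 1e6   # beam Lorentz factor (fiducial value for powerful blazars)
alpha = 1e-15   # beam-to-plasma density ratio (fiducial value)

# Fraction of beam energy transferred during the oblique-instability
# phase of a cold beam: ~(alpha / gamma_b)^(1/3)
f_oblique = (alpha / gamma_b) ** (1.0 / 3.0)

# Fraction deposited by a warm (TeV-blazar) beam, where longitudinal
# relaxation is suppressed: ~alpha * gamma_b
f_warm = alpha * gamma_b

print(f"oblique phase: {f_oblique:.1e}")  # ~1e-07
print(f"warm beam:     {f_warm:.1e}")     # ~1e-09
```

Both fractions are tiny compared with the ~10% longitudinal-relaxation efficiency of cold beams, which is the basis for the conclusion that most of the beam energy survives to power the reprocessed GeV emission.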