Fundamental relations between information and estimation have been established in the literature for the scalar Gaussian and Poisson channels. In this work, we demonstrate that such relations hold for a much larger class of observation models. We introduce the natural family of scalar Lévy channels, in which the distribution of the output conditioned on the input is infinitely divisible. For Lévy channels, we establish new representations relating the mutual information between the channel input and output to an optimal estimation loss, thereby unifying and considerably extending results from the Gaussian and Poisson settings. We demonstrate the richness of our results by working out two examples of Lévy channels, namely the Gamma channel and the Negative Binomial channel, with corresponding relations between information and estimation. Extensions to the setting of mismatched estimation are also presented.
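For context, the classical Gaussian instance of such a relation is the I-MMSE formula of Guo, Shamai, and Verdú: for the scalar channel $Y = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$, the derivative of the mutual information with respect to the signal-to-noise ratio equals half the minimum mean-squared error,
\[
\frac{d}{d\,\mathrm{snr}}\, I(X; \sqrt{\mathrm{snr}}\, X + N) \;=\; \frac{1}{2}\, \mathbb{E}\!\left[\big(X - \mathbb{E}[X \mid Y]\big)^{2}\right] \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}).
\]
The Lévy-channel representations developed in this work generalize relations of this type beyond the Gaussian and Poisson cases.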