In this work, parametric measures of information (information matrices) are defined from the nonparametric information measures known as (h, φ)-divergences. Asymptotic distributions of these information matrices are obtained when the parameter is replaced by its maximum likelihood estimator. On the basis of these results, tests of hypotheses are constructed.
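As a rough illustration of the nonparametric measures the abstract refers to, the sketch below computes a discrete (h, φ)-divergence under the usual formulation D(P, Q) = h(Σᵢ qᵢ φ(pᵢ/qᵢ)); the function names and the choice of h and φ here are illustrative, not taken from the paper.

```python
import numpy as np

def h_phi_divergence(p, q, phi, h=lambda x: x):
    """(h, phi)-divergence between discrete distributions p and q,
    computed as h( sum_i q_i * phi(p_i / q_i) )."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return h(np.sum(q * phi(p / q)))

# Kullback-Leibler divergence recovered as the special case
# h(x) = x (identity) and phi(t) = t * log(t).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
kl = h_phi_divergence(p, q, phi=lambda t: t * np.log(t))
```

With suitable choices of h and φ this family also covers the Rényi and Csiszár divergence classes, which is what makes it a convenient starting point for defining parametric information matrices.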