In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields a maximum-likelihood solution, it has two well-known drawbacks: (i) it requires careful initialization of the parameters, and (ii) the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model, together with a modification of the classical EM algorithm that finds the optimal number of kernels in the mixture. We test this method on synthetic and real data and compare the results with those obtained by the classical EM algorithm with a fixed number of kernels.