Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, Tsallis entropy is used instead of Shannon entropy, and a novel ICA algorithm is proposed that uses kernel density estimation (KDE) to estimate the source distributions. KDE is evaluated directly from the original data samples, so it solves an important problem in ICA: how to choose nonlinear functions to estimate the probability density functions (pdfs) of the sources.
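To illustrate the KDE idea referred to above, the following is a minimal one-dimensional sketch of a Gaussian kernel density estimate computed directly from data samples; the bandwidth rule and function name are illustrative assumptions, not the paper's specific construction, which may differ in kernel choice and dimensionality.

```python
import numpy as np

def gaussian_kde(samples, points, bandwidth=None):
    """Kernel density estimate evaluated directly from data samples.

    Gaussian kernel; bandwidth defaults to Silverman's rule of thumb
    (an assumption for this sketch, not necessarily the paper's choice).
    """
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    if bandwidth is None:
        # Silverman's rule for a Gaussian kernel in one dimension
        bandwidth = 1.06 * samples.std(ddof=1) * n ** (-1 / 5)
    # Scaled differences between each evaluation point and each sample
    diffs = (np.asarray(points, dtype=float)[:, None] - samples[None, :]) / bandwidth
    # Sum Gaussian kernels over the samples and normalize
    kernels = np.exp(-0.5 * diffs ** 2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (n * bandwidth)

# Example: recover the pdf of a standard normal from 2000 samples
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
grid = np.linspace(-3.0, 3.0, 7)
pdf_hat = gaussian_kde(x, grid)
```

Because the estimate is built from the observed samples themselves, no parametric family of nonlinear score functions has to be chosen in advance.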