Quantized Minimum Error Entropy Criterion

Abstract

Compared with traditional learning criteria, such as the mean square error, the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Renyi's entropy estimator, called the information potential (IP), is a popular MEE cost in information theoretic learning. The computational complexity of the IP is, however, quadratic in the number of samples due to its double summation, which creates a computational bottleneck, especially for large-scale data sets. To address this problem, in this paper we propose an efficient quantization approach that reduces the computational burden of the IP, decreasing the complexity from O(N^2) to O(MN) with M ≪ N. The new learning criterion is called the quantized MEE (QMEE). Some basic properties of QMEE are presented, and illustrative examples with linear-in-parameter models verify its excellent performance.
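The complexity reduction described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the simple nearest-codeword quantizer, the threshold `eps`, and the Gaussian kernel with bandwidth `sigma` are assumptions made here. The idea is that the IP's double sum over N x N sample pairs is replaced by a weighted sum over N samples and M codewords, where each codeword carries the count of samples merged into it.

```python
import numpy as np

def quantize(errors, eps):
    """Merge each error sample into the nearest codeword within distance
    eps, or start a new codeword. Returns (codewords, counts).
    (A simple online quantizer, assumed here for illustration.)"""
    codebook, counts = [], []
    for e in errors:
        if codebook:
            d = np.abs(np.array(codebook) - e)
            j = int(np.argmin(d))
            if d[j] <= eps:
                counts[j] += 1
                continue
        codebook.append(float(e))
        counts.append(1)
    return np.array(codebook), np.array(counts)

def quantized_ip(errors, eps, sigma=1.0):
    """Quantized information potential: kernel evaluations between the N
    samples and M codewords only, so O(MN) instead of O(N^2)."""
    errors = np.asarray(errors, dtype=float)
    c, m = quantize(errors, eps)
    # Gaussian kernel between every sample and every codeword,
    # weighted by how many samples each codeword represents.
    diff = errors[:, None] - c[None, :]
    k = np.exp(-diff**2 / (2.0 * sigma**2))
    return float((k @ m).sum()) / (len(errors) * m.sum())
```

With `eps = 0` every distinct sample becomes its own codeword and the quantized IP coincides with the full O(N^2) estimator, so the threshold directly trades accuracy for speed.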
