
The Kernel Recursive Generalized Cauchy Kernel Loss Algorithm



Abstract

As a nonlinear similarity measure defined in a kernel space, the generalized correntropic loss (GC-Loss) has been successfully applied to signal processing and machine learning in non-Gaussian situations thanks to its ability to extract high-order statistical properties of data. However, the highly non-convex performance surface of the GC-Loss leads to poor optimization performance. To address this issue, we propose a new similarity measure defined in the kernel space by incorporating a generalized Gaussian distributed kernel function into the Cauchy loss. By minimizing the proposed loss function and using the kernel method, a novel kernel recursive generalized Cauchy kernel loss (KRGCKL) algorithm is derived in the reproducing kernel Hilbert space (RKHS). Simulations on different examples under non-Gaussian noise show the superiority of KRGCKL over other representative algorithms in terms of filtering accuracy and robustness to large outliers.
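The abstract does not give the exact form of the proposed loss or the recursive update, so the following is only a minimal Python sketch of the idea it describes. It assumes an illustrative generalized Cauchy kernel loss L(e) = ln(1 + (1 - exp(-λ|e|^α))/γ²), i.e. the Cauchy log term applied to a generalized-Gaussian measure of the error, and trains a simple stochastic-gradient (kernel-LMS-style) filter with a Gaussian RKHS kernel rather than the paper's recursive KRGCKL update. All function names, parameter values, and the loss form are assumptions for illustration.

```python
import numpy as np

# Gaussian kernel for the RKHS feature map (assumed choice; the abstract
# does not specify the reproducing kernel used in the paper).
def gauss_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

# Derivative of the assumed loss L(e) = ln(1 + (1 - exp(-lam*|e|**alpha)) / gamma**2).
# This form is a guess at the construction described in the abstract,
# not the paper's exact definition.
def gck_loss_grad(e, lam=1.0, alpha=2.0, gamma=1.0):
    g = np.exp(-lam * np.abs(e) ** alpha)
    num = lam * alpha * np.abs(e) ** (alpha - 1) * np.sign(e) * g
    return num / (gamma ** 2 + 1.0 - g)  # dL/de, bounded for large |e|

# Simplified stochastic-gradient kernel filter; the paper's algorithm is
# recursive (RLS-like), which this sketch does not reproduce.
def kernel_gck_filter(X, d, eta=0.5, sigma=1.0):
    centers, weights, preds = [], [], []
    for x, target in zip(X, d):
        y = sum(w * gauss_kernel(x, c, sigma) for w, c in zip(weights, centers))
        e = target - y
        centers.append(x)
        weights.append(eta * gck_loss_grad(e))  # robust, saturating update
        preds.append(y)
    return np.array(preds)

# Toy demo: learn a nonlinear mapping under impulsive (non-Gaussian) noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 1))
clean = np.sin(3 * X[:, 0])
outliers = (rng.random(500) < 0.05) * rng.standard_cauchy(500)
d = clean + 0.05 * rng.standard_normal(500) + outliers
preds = kernel_gck_filter(X, d)
print("MSE on last 100 samples vs clean signal:",
      np.mean((preds[-100:] - clean[-100:]) ** 2))
```

Because the gradient of the Cauchy-type loss saturates for large errors, the update applied to each new kernel center stays bounded when an outlier arrives, which is the intuition behind the robustness claim in the abstract.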
