Expert Systems with Applications

Fully adaptive dictionary for online correntropy kernel learning using proximal methods



Abstract

We introduce a new sparse variant of the Correntropy Kernel Learning (CKL) model, hereafter named Fully ADaptive Online Sparse CKL (FADOS-CKL), for online system identification in the presence of outliers. For this purpose, we develop a fully adaptive dictionary of support vectors (SVs) that can either grow (as most kernel models to date do) or shrink when some of the SVs become obsolete over time. For the inclusion of SVs into the dictionary, the performances of existing strategies (ALD, Novelty, Surprise, and Coherence) are compared in this paper, while for the elimination of SVs we adopt a class of optimization techniques known as proximal methods. Dictionary updating in FADOS-CKL is carried out on the fly through a recursive methodology based on the Sherman-Morrison-Woodbury formula, which updates the kernel matrix and its inverse with low computational complexity. Aiming at achieving the smallest predictive errors with the highest sparsity level, a comprehensive performance comparison involving the FADOS-CKL model and powerful alternatives is carried out on two large-scale benchmark datasets under different levels of outlier contamination. The results indicate an impressive balance between the reduction in dictionary size and the corresponding generalization capability of the proposed FADOS-CKL model relative to the existing alternatives.
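To make the abstract's three mechanisms concrete, the following is a minimal sketch (not the authors' FADOS-CKL implementation) of an online kernel dictionary that (i) admits new SVs via a coherence test, (ii) grows and shrinks the inverse of the dictionary kernel matrix with Sherman-Morrison/Woodbury-style block updates, and (iii) prunes obsolete SVs through an L1 proximal step (soft-thresholding). The Gaussian kernel, the plain LMS-style coefficient update (the paper uses a correntropy loss), and the class and parameter names (SparseKernelDictionary, mu0, eta, lam) are illustrative assumptions.

```python
import numpy as np

class SparseKernelDictionary:
    """Sketch of a grow/shrink SV dictionary with incremental inverse updates."""

    def __init__(self, kernel_width=1.0, mu0=0.9, eta=0.1, lam=1e-3):
        self.sigma = kernel_width     # Gaussian kernel bandwidth (assumed kernel)
        self.mu0 = mu0                # coherence threshold for SV admission
        self.eta = eta                # step size of the online coefficient update
        self.lam = lam                # proximal (soft-threshold) regularization
        self.D = []                   # dictionary of support vectors
        self.alpha = np.zeros(0)      # expansion coefficients
        self.Kinv = np.zeros((0, 0))  # inverse of the dictionary kernel matrix

    def _kernel(self, x, z):
        d = np.asarray(x, dtype=float) - np.asarray(z, dtype=float)
        return np.exp(-np.dot(d, d) / (2.0 * self.sigma ** 2))

    def _kvec(self, x):
        return np.array([self._kernel(x, d) for d in self.D])

    def predict(self, x):
        return float(self.alpha @ self._kvec(x)) if self.D else 0.0

    def _grow_inverse(self, k, ktt):
        # Block (Schur-complement) update of Kinv when one SV is appended.
        if len(self.D) == 0:
            return np.array([[1.0 / ktt]])
        a = self.Kinv @ k
        s = ktt - k @ a                       # Schur complement (positive for PD kernels)
        top = self.Kinv + np.outer(a, a) / s
        return np.block([[top, -a[:, None] / s],
                         [-a[None, :] / s, np.array([[1.0 / s]])]])

    def _shrink_inverse(self, i):
        # Downdate Kinv when SV i is removed: A^{-1} = E - f f^T / g,
        # where [[E, f], [f^T, g]] is Kinv with index i permuted last.
        keep = [j for j in range(len(self.D)) if j != i]
        E = self.Kinv[np.ix_(keep, keep)]
        f = self.Kinv[keep, i]
        g = self.Kinv[i, i]
        return E - np.outer(f, f) / g

    def update(self, x, y):
        k = self._kvec(x)
        e = y - (self.alpha @ k if self.D else 0.0)   # a priori prediction error
        # Coherence test: admit x only if it is not already well represented.
        if len(self.D) == 0 or np.max(np.abs(k)) <= self.mu0:
            self.Kinv = self._grow_inverse(k, self._kernel(x, x))
            self.D.append(np.asarray(x, dtype=float))
            self.alpha = np.append(self.alpha, 0.0)
            k = np.append(k, self._kernel(x, x))
        # Simple LMS-style coefficient update (stand-in for the correntropy step).
        self.alpha = self.alpha + self.eta * e * k
        # Proximal step: soft-threshold coefficients, then prune zeroed SVs.
        self.alpha = np.sign(self.alpha) * np.maximum(np.abs(self.alpha) - self.lam, 0.0)
        for i in reversed(range(len(self.D))):
            if self.alpha[i] == 0.0 and len(self.D) > 1:
                self.Kinv = self._shrink_inverse(i)
                self.alpha = np.delete(self.alpha, i)
                del self.D[i]
        return e
```

The point of the block update and downdate is cost: each sample is processed in O(m^2) for a dictionary of size m, instead of the O(m^3) needed to re-invert the kernel matrix from scratch whenever an SV is added or removed.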


