International Conference on Algorithmic Learning Theory

Learning Kernel Perceptrons on Noisy Data Using Random Projections



Abstract

In this paper, we address the issue of learning nonlinearly separable concepts with a kernel classifier in the situation where the data at hand are corrupted by uniform classification noise. Our proposed approach relies on the combination of the technique of random or deterministic projections with a classification-noise-tolerant perceptron learning algorithm that assumes distributions defined over finite-dimensional spaces. Provided a sufficient separation margin characterizes the problem, this strategy makes it possible to envision learning from a noisy distribution in any separable Hilbert space, regardless of its dimension; learning with any appropriate Mercer kernel is therefore possible. We prove that the required sample complexity and running time of our algorithm are polynomial in the classical PAC learning parameters. Numerical simulations on toy datasets and on data from the UCI repository support the validity of our approach.
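To make the pipeline the abstract describes concrete, here is a minimal sketch: build an empirical kernel map on the training sample, randomly project it down to a finite-dimensional space, and run a perceptron on the projected, noisy data. The RBF kernel, the Gaussian Johnson-Lindenstrauss-style projection, the concentric-rings toy data, the noise rate eta, and the use of a plain (not noise-corrected) perceptron as a stand-in for the paper's noise-tolerant variant are all illustrative assumptions, not the authors' exact construction.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between the rows of X and Y.
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def random_projection(K, d, rng):
    # Johnson-Lindenstrauss-style Gaussian projection of the empirical
    # kernel map (rows of K) down to a d-dimensional space.
    R = rng.normal(size=(K.shape[1], d)) / np.sqrt(d)
    return K @ R

def perceptron(Z, y, epochs=50):
    # Plain perceptron run in the projected space; the paper relies on a
    # classification-noise-tolerant variant instead (stand-in here).
    w = np.zeros(Z.shape[1])
    for _ in range(epochs):
        for z, t in zip(Z, y):
            if t * (w @ z) <= 0.0:
                w += t * z
    return w

rng = np.random.default_rng(0)

# Toy nonlinearly separable data: two concentric rings, labels in {-1, +1}.
n = 200
radius = np.r_[rng.uniform(0.0, 1.0, n // 2), rng.uniform(2.0, 3.0, n // 2)]
angle = rng.uniform(0.0, 2.0 * np.pi, n)
X = np.c_[radius * np.cos(angle), radius * np.sin(angle)]
y = np.r_[-np.ones(n // 2), np.ones(n // 2)]

# Uniform classification noise: each label is flipped with probability eta.
eta = 0.1
y_noisy = np.where(rng.uniform(size=n) < eta, -y, y)

# Empirical kernel map on the sample, then random projection to 50 dimensions.
K = rbf_kernel(X, X, gamma=0.5)
Z = random_projection(K, d=50, rng=rng)

w = perceptron(Z, y_noisy)
print("accuracy on clean labels:", np.mean(np.sign(Z @ w) == y))

The intuition behind this arrangement matches the abstract's margin condition: if the problem has a sufficient separation margin in the kernel-induced Hilbert space, a random projection to a modest number of dimensions preserves that margin with high probability, so a perceptron-style learner can still succeed on the projected, noise-corrupted sample.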

