IEEE Transactions on Neural Networks and Learning Systems

Dependent Online Kernel Learning With Constant Number of Random Fourier Features



Abstract

Traditional analyses of online kernel learning assume that the training sequence is independently and identically distributed (i.i.d.). Recent studies show that when the loss function is smooth and strongly convex, given i.i.d. training instances, a constant sampling complexity of random Fourier features suffices to guarantee the convergence rate of the excess risk, which is optimal for online kernel learning up to a factor. However, the i.i.d. hypothesis is often too strong in practice, which greatly limits the value of these results. In this paper, we study the sampling complexity of random Fourier features in online kernel learning under non-i.i.d. assumptions. We prove that the sampling complexity remains constant under non-i.i.d. settings, but the convergence rate of the excess risk depends on the mixing coefficient, which measures the extent to which the training sequence deviates from i.i.d. We conduct experiments on both artificial and real large-scale data sets to verify our theory.
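The setting described in the abstract, online kernel learning made scalable by a fixed number of random Fourier features, can be sketched as follows. This is an illustrative reconstruction rather than the authors' algorithm: the RBF kernel, the target function `np.sin(x.sum())`, the step-size schedule, and all dimensions are assumptions, and the stream here is i.i.d., whereas the paper's analysis additionally covers non-i.i.d. (mixing) sequences.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Random Fourier features for the RBF kernel (Rahimi & Recht) ---
# Hypothetical parameters for illustration; the key point from the paper
# is that the number of features D can stay constant in the horizon T.
d, D, gamma = 3, 500, 0.5
W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, D))  # spectral samples
b = rng.uniform(0.0, 2.0 * np.pi, size=D)               # random phases

def phi(x):
    """Explicit feature map with phi(x) @ phi(y) ~= exp(-gamma*||x - y||^2)."""
    return np.sqrt(2.0 / D) * np.cos(x @ W + b)

# --- Online learning: SGD on the squared loss in the RFF space ---
T = 2000
theta = np.zeros(D)
losses = []
for t in range(1, T + 1):
    x = rng.normal(size=d)               # i.i.d. stream in this sketch
    y = np.sin(x.sum())                  # hypothetical target function
    z = phi(x)
    pred = theta @ z
    losses.append((pred - y) ** 2)
    eta = 1.0 / np.sqrt(t)               # decaying step size
    theta -= eta * 2.0 * (pred - y) * z  # gradient of the squared loss

early, late = np.mean(losses[:200]), np.mean(losses[-200:])
print(f"mean loss, first 200 rounds: {early:.3f}; last 200 rounds: {late:.3f}")
```

Because each round touches only the D-dimensional weight vector, the per-round cost is O(D) regardless of how many instances have been seen, which is what makes a constant sampling complexity of random Fourier features attractive for online learning.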
