
Asynchronous Doubly Stochastic Sparse Kernel Learning



Abstract

Kernel methods have achieved tremendous success in the past two decades. In the current big data era, the volume of collected data has grown enormously. However, existing kernel methods are not scalable enough at either the training or the prediction step. To address this challenge, in this paper we first introduce a general sparse kernel learning formulation based on the random feature approximation, where the loss functions are possibly non-convex. We then propose a new asynchronous parallel doubly stochastic algorithm for large-scale sparse kernel learning (AsyDSSKL). To the best of our knowledge, AsyDSSKL is the first algorithm to combine asynchronous parallel computation with doubly stochastic optimization. We also provide a comprehensive convergence guarantee for AsyDSSKL. Importantly, experimental results on various large-scale real-world datasets show that AsyDSSKL significantly outperforms existing kernel methods in computational efficiency at both the training and prediction steps.
