Neurocomputing

Faster constrained linear regression via two-step preconditioning


Abstract

In this paper, we study the large scale constrained linear regression problem and propose a two-step preconditioning method, which is based on some recent developments on random projection, sketching techniques and convex optimization methods. Combining the method with (accelerated) mini-batch SGD, we can achieve an approximate solution with a time complexity lower than that of the state-of-the-art techniques for the low precision case. Our idea can also be extended to the high precision case, which gives an alternative implementation to the Iterative Hessian Sketch (IHS) method with significantly improved time complexity. Experiments on benchmark and synthetic datasets suggest that our methods indeed outperform existing ones considerably in both the low and high precision cases. (C) 2019 Elsevier B.V. All rights reserved.
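The abstract's first step, preconditioning via random projection, can be illustrated with a minimal numpy sketch. The Gaussian sketch, the sketch size m = 4d, and the QR-based right preconditioner below are generic textbook choices (in the spirit of Blendenpik-style sketch-and-precondition solvers), not the authors' exact two-step construction, whose details are in the full text.

```python
import numpy as np

# Illustrative sketch-and-precondition step for least squares
# min_x ||Ax - b||^2 with a tall, ill-conditioned A.
rng = np.random.default_rng(0)
n, d = 2000, 50
# Deliberately ill-conditioned design matrix: column scales span 1e-3..1e3.
A = rng.standard_normal((n, d)) * np.logspace(-3, 3, d)
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Step 1: compress the tall matrix A with a Gaussian random projection.
m = 4 * d
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Step 2: QR-factor the sketch. Right-multiplying A by R^{-1} makes
# A @ inv(R) well-conditioned, so first-order methods such as
# (accelerated) mini-batch SGD need far fewer iterations.
_, R = np.linalg.qr(S @ A)
A_pre = A @ np.linalg.inv(R)

cond_before = np.linalg.cond(A)
cond_after = np.linalg.cond(A_pre)
```

In this generic recipe one would then run (projected) mini-batch SGD in the transformed variable y = Rx on the well-conditioned system, mapping back via x = R^{-1}y; for constrained regression, each iterate is additionally projected onto the feasible set.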
