Neurocomputing

On the iteration complexity analysis of Stochastic Primal-Dual Hybrid Gradient approach with high probability


Abstract

In this paper, we propose a stochastic Primal-Dual Hybrid Gradient (PDHG) approach for solving a wide spectrum of regularized stochastic minimization problems, where the regularization term is composed with a linear function. It has been recognized that solving this kind of problem is challenging: the proximal mapping associated with the regularization term has no closed-form solution because of the imposed linear composition, and the per-iteration cost of computing the full gradient of the expected objective function is extremely high when the number of input data samples is very large.
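To make the problem class concrete, below is a minimal sketch of a stochastic PDHG-style iteration for a least-squares instance of min_x (1/n) Σ_i f_i(x) + λ‖Ax‖₁, where the ℓ₁ regularizer is composed with a linear map A. The step sizes, mini-batch sampling, and extrapolation choice here are illustrative assumptions and are not taken from the paper; only the overall primal-dual structure (dual proximal step on the conjugate, stochastic gradient step on the data-fitting term) is what the abstract refers to.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme) of stochastic PDHG for
#   min_x  (1/n) * sum_i f_i(x) + lam * ||A x||_1,
# where prox of the composite term lam*||A.||_1 has no closed form, but the
# prox of lam*||.||_1 (and hence of its conjugate) does.
def stochastic_pdhg(X, b, A, lam, tau, sigma, theta=1.0, n_iter=1000, batch=1, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = A.shape[0]
    x = np.zeros(d)          # primal variable
    x_bar = x.copy()         # extrapolated primal point
    y = np.zeros(m)          # dual variable for the composite term

    for _ in range(n_iter):
        # Dual step: prox of the conjugate of lam*||.||_1 is the
        # projection onto the ell_infinity ball of radius lam.
        y = np.clip(y + sigma * (A @ x_bar), -lam, lam)

        # Stochastic gradient of the data-fitting term (least squares here),
        # replacing the expensive full gradient over all n samples.
        idx = rng.integers(0, n, size=batch)
        grad = X[idx].T @ (X[idx] @ x - b[idx]) / batch

        # Primal step using the sampled gradient and A^T y.
        x_new = x - tau * (grad + A.T @ y)

        # Extrapolation (over-relaxation) of the primal iterate.
        x_bar = x_new + theta * (x_new - x)
        x = x_new

    return x
```

The dual update only ever needs the proximal operator of the simple regularizer (here a projection), which is why the linear composition does not have to be inverted; the stochastic gradient step addresses the per-iteration cost issue mentioned above by touching only a mini-batch of the data samples.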
