Journal of Industrial and Management Optimization

A MODIFIED SCALED MEMORYLESS BFGS PRECONDITIONED CONJUGATE GRADIENT ALGORITHM FOR NONSMOOTH CONVEX OPTIMIZATION

Abstract

This paper presents a nonmonotone scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving nonsmooth convex optimization problems, combining the scaled memoryless BFGS preconditioned conjugate gradient method with a nonmonotone technique and the Moreau-Yosida regularization. The proposed method uses approximate function and gradient values of the Moreau-Yosida regularization instead of the corresponding exact values. Under mild conditions, the global convergence of the proposed method is established. Preliminary numerical results and comparisons show that the proposed method can be applied to large-scale nonsmooth convex optimization problems.
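For reference, the Moreau-Yosida regularization referred to in the abstract is a standard construction; a minimal sketch of its definition, with regularization parameter \lambda > 0 (notation assumed here, not taken from the paper), is

    F(x) = \min_{z \in \mathbb{R}^n} \Big\{ f(z) + \tfrac{1}{2\lambda} \, \| z - x \|^2 \Big\},
    \qquad
    \nabla F(x) = \frac{x - p(x)}{\lambda},

where p(x) denotes the unique minimizer above (the proximal point of x). Even when f is nonsmooth, F is continuously differentiable with a Lipschitz continuous gradient, which is what allows a conjugate gradient scheme to be applied to F using approximate values of F and \nabla F, as described in the abstract.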
