Computers, Materials & Continua

A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization


Abstract

It is well known that Newton and quasi-Newton algorithms are effective for small- and medium-scale smooth problems, because they make full use of the gradient's information, but they fail on nonsmooth problems. Bundle methods handle both smooth and nonsmooth complex problems successfully, yet they too are effective only for small and medium optimization models, since they must store and update the information held in the parameter bundle. The conjugate gradient algorithm, by contrast, is effective for both large-scale smooth and nonsmooth optimization models owing to its simplicity: it uses only the objective function's information, combined with the Moreau-Yosida regularization technique. Accordingly, a modified three-term conjugate gradient algorithm is proposed; it possesses a sufficient descent property and a trust-region character. It is globally convergent under mild assumptions, and numerical tests show it to be more efficient than similar optimization algorithms.
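The abstract does not give the paper's update formulas. As an illustrative sketch only, a generic three-term conjugate gradient direction of the form d = -g + beta*d_prev - theta*y (one standard construction that yields the sufficient descent property g^T d = -||g||^2) can be applied to a smooth convex test function with an Armijo backtracking line search. The test function, step-size constants, and safeguards below are assumptions for the demonstration, not the method of this paper, and the Moreau-Yosida smoothing step for nonsmooth objectives is omitted.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def f(x, target):
    # convex quadratic test objective: sum_i (x_i - t_i)^2
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def grad(x, target):
    # gradient of the quadratic above: 2 * (x - t)
    return [2.0 * (xi - ti) for xi, ti in zip(x, target)]

def three_term_cg(x0, target, tol=1e-8, max_iter=500):
    x = list(x0)
    g = grad(x, target)
    d = [-gi for gi in g]  # initial steepest-descent direction
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        # Armijo backtracking line search along d
        alpha, fx, gd = 1.0, f(x, target), dot(g, d)
        while True:
            x_new = [xi + alpha * di for xi, di in zip(x, d)]
            if f(x_new, target) <= fx + 1e-4 * alpha * gd or alpha < 1e-12:
                break
            alpha *= 0.5
        g_new = grad(x_new, target)
        y = [gn - gi for gn, gi in zip(g_new, g)]  # gradient difference
        dy = dot(d, y)
        if abs(dy) < 1e-16:
            beta = theta = 0.0  # safeguard: fall back to steepest descent
        else:
            beta = dot(g_new, y) / dy
            theta = dot(g_new, d) / dy
        # three-term direction; by construction g_new^T d = -||g_new||^2,
        # which gives the sufficient descent property mentioned above
        d = [-gn + beta * di - theta * yi
             for gn, di, yi in zip(g_new, d, y)]
        x, g = x_new, g_new
    return x
```

Because the direction stores only a few vectors per iteration (no bundle or Hessian approximation), memory grows linearly with the problem dimension, which is the large-scale advantage the abstract refers to.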

