ISM International Statistical Conference

A New Conjugate Gradient Method with Sufficient Descent without any Line Search for Unconstrained Optimization


Abstract

Conjugate gradient methods are among the most widely used methods for solving nonlinear unconstrained optimization problems, especially large-scale ones. Their wide applicability is due to their simplicity and low memory requirements. The sufficient descent property is an important issue in the analysis and implementation of conjugate gradient methods. In this paper, a new conjugate gradient method is proposed for unconstrained optimization problems. The theoretical analysis shows that the directions generated by the new method always satisfy the sufficient descent property, and this property is independent of the line search used. Furthermore, numerical experiments comparing the new method with other well-known conjugate gradient methods show that the new method is efficient for some unconstrained optimization problems.
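The abstract does not reproduce the new update formula, so the sketch below only illustrates the property the paper is about: a direction d_k is a sufficient descent direction when g_k^T d_k <= -c ||g_k||^2 for some constant c > 0, a condition that should hold no matter which line search (or fixed step) is used. The beta formula here (PRP+), the fixed step length, and the function names are placeholders chosen for the sketch, not the authors' method, which is claimed to satisfy the condition automatically.

import numpy as np

def cg_with_descent_check(grad, x0, c=0.01, step=1e-3, tol=1e-6, max_iter=1000):
    # Nonlinear conjugate gradient loop with an explicit sufficient-descent safeguard.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + step * d                     # fixed step stands in for a line search
        g_new = grad(x)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PRP+ beta, a placeholder choice
        d = -g_new + beta * d
        if g_new @ d > -c * (g_new @ g_new): # sufficient descent: g_k^T d_k <= -c ||g_k||^2
            d = -g_new                       # restart; the paper's beta avoids this by construction
        g = g_new
    return x

# Example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x.
x_min = cg_with_descent_check(lambda x: x, np.ones(5), step=0.1)

The safeguard restart is a common way to enforce sufficient descent for formulas like PRP+ that do not guarantee it; the point of the paper is that its new beta makes such a safeguard unnecessary for any line search.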
