Conference: International Conference on Quantitative Sciences and Its Applications

Conjugate Gradient Methods with Sufficient Descent Condition for Large-scale Unconstrained Optimization

Abstract

In this paper, we modify the standard conjugate gradient method so that its search direction satisfies the sufficient descent condition. We prove that the modified conjugate gradient method is globally convergent under the Armijo line search. Numerical results show that the proposed conjugate gradient method is efficient compared with some of its standard counterparts on large-scale unconstrained optimization problems.
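The paper's specific modification is not reproduced in this abstract, but the general recipe it describes can be sketched briefly: compute a conjugate gradient direction, guarantee the sufficient descent condition g_k^T d_k <= -c ||g_k||^2 at every iteration, and select the step size by backtracking Armijo line search. The sketch below is illustrative only and is not the authors' method: the PRP+ choice of the conjugacy parameter, the constant c, the steepest-descent restart, and the helper names armijo_step and cg_sufficient_descent are all assumptions.

```python
import numpy as np

def armijo_step(f, g, x, d, sigma=1e-4, shrink=0.5, t0=1.0):
    """Backtracking Armijo line search: shrink t until
    f(x + t*d) <= f(x) + sigma * t * g(x)^T d."""
    fx = f(x)
    gd = g(x) @ d
    t = t0
    while f(x + t * d) > fx + sigma * t * gd:
        t *= shrink
    return t

def cg_sufficient_descent(f, g, x0, c=1e-3, tol=1e-6, max_iter=5000):
    """Nonlinear conjugate gradient with a sufficient-descent safeguard:
    if the trial direction violates g_k^T d_k <= -c ||g_k||^2,
    restart with the steepest-descent direction -g_k."""
    x = np.asarray(x0, dtype=float)
    gk = g(x)
    d = -gk
    for _ in range(max_iter):
        if np.linalg.norm(gk) <= tol:
            break
        t = armijo_step(f, g, x, d)          # Armijo backtracking step size
        x_new = x + t * d
        g_new = g(x_new)
        # PRP+ conjugacy parameter (an illustrative choice, not the paper's)
        beta = max(g_new @ (g_new - gk) / (gk @ gk), 0.0)
        d_new = -g_new + beta * d
        # Enforce the sufficient descent condition g^T d <= -c ||g||^2
        if g_new @ d_new > -c * (g_new @ g_new):
            d_new = -g_new
        x, gk, d = x_new, g_new, d_new
    return x

if __name__ == "__main__":
    # 2-D Rosenbrock test problem
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    g = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(cg_sufficient_descent(f, g, np.array([-1.2, 1.0])))
```

The restart fallback above is the simplest way to force the descent inequality at every iteration; according to the abstract, the paper instead modifies the conjugate gradient direction itself so that the condition is satisfied, but those details are not given here.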
