Optimization Letters

Global convergence of some modified PRP nonlinear conjugate gradient methods


Abstract

Recently, similar to Hager and Zhang (SIAM J Optim 16:170–192, 2005), Yu (Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. PhD thesis, Sun Yat-Sen University, 2007) and Yuan (Optim Lett 3:11–21, 2009) proposed modified PRP conjugate gradient methods that generate sufficient descent directions without any line search. To obtain the global convergence of their algorithms, they need the assumption that the stepsize is bounded away from zero. In this paper, we make a slight modification to these methods so that the modified methods retain the sufficient descent property. Without requiring a positive lower bound on the stepsize, we prove that the proposed methods are globally convergent. Some numerical results are also reported.
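The sufficient descent property mentioned in the abstract can be illustrated with a minimal sketch. The code below implements one well-known three-term PRP variant (in the style of Zhang, Zhou and Li), not necessarily the exact method proposed in this paper; the point it demonstrates is that the direction d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 identically, so descent holds independently of the line search. The quadratic test problem, tolerance, and Armijo parameters are illustrative assumptions.

```python
import numpy as np

def three_term_prp(f, grad, x0, tol=1e-6, max_iter=2000):
    """Three-term PRP conjugate gradient sketch (Zhang-Zhou-Li style).

    d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with
      beta_k  = g_{k+1}^T y_k / ||g_k||^2   (the classical PRP coefficient)
      theta_k = g_{k+1}^T d_k / ||g_k||^2
    The two extra terms cancel in g_{k+1}^T d_{k+1}, leaving
    g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 for every k, i.e. sufficient
    descent is guaranteed without any assumption on the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # initial steepest-descent direction
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Backtracking Armijo line search (parameters are illustrative)
        t, fx, slope = 1.0, f(x), g @ d  # slope = g^T d = -||g||^2 < 0
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        beta = (g_new @ y) / gnorm2      # PRP coefficient
        theta = (g_new @ d) / gnorm2     # extra term enforcing descent
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x, g

# Usage on a small convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
x_star, g_star = three_term_prp(lambda x: 0.5 * x @ (A @ x) - b @ x,
                                lambda x: A @ x - b,
                                np.zeros(3))
```

At a minimizer the gradient A x - b vanishes, so the returned gradient norm indicates convergence even though the quadratic here stands in for the general nonlinear objectives studied in the paper.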

