Optimization Methods and Software

An accelerated conjugate gradient algorithm with guaranteed descent and conjugacy conditions for unconstrained optimization



Abstract

In this paper, we suggest a new conjugate gradient algorithm for which both the descent and the conjugacy conditions are guaranteed for all k ≥ 0. The search direction is selected as a linear combination of −g_{k+1} and s_k, where g_{k+1} = ∇f(x_{k+1}) and s_k = x_{k+1} − x_k, and the coefficients in this linear combination are selected in such a way that both the descent and the conjugacy conditions are satisfied at every iteration. In order to define the algorithm and to prove its convergence, a modified Wolfe line search is introduced, in which the parameter in the standard second (curvature) Wolfe condition is changed at every iteration. It is shown that for general nonlinear functions, the algorithm with the modified Wolfe line search generates directions bounded away from infinity. The algorithm uses an acceleration scheme that modifies the step length α_k so as to improve the reduction of the function values along the iterations. Numerical comparisons with some conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the computational scheme outperforms known conjugate gradient algorithms such as those of Hestenes and Stiefel, Polak et al., Dai and Yuan, and hybrid Dai and Yuan, as well as the CG_DESCENT method of Hager and Zhang with the Wolfe line search conditions.

Keywords: conjugate gradient, Wolfe line search, descent condition, conjugacy condition, unconstrained optimization
AMS Subject Classification: 49M20, 65K05, 90C30
Permalink: http://dx.doi.org/10.1080/10556788.2010.501379
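The abstract does not state the paper's exact formulas, but the construction it describes, choosing the two coefficients of d_{k+1} = −θ_k g_{k+1} + β_k s_k so that a descent condition and a conjugacy condition both hold, amounts to solving a 2×2 linear system at each iteration. A minimal sketch, assuming the descent condition g_{k+1}ᵀd_{k+1} = −‖g_{k+1}‖² and a Dai–Liao-style conjugacy condition y_kᵀd_{k+1} = −t·(s_kᵀg_{k+1}) (both choices are illustrative, not taken from the paper):

```python
import numpy as np

def cg_direction(g_next, s, y, t=1.0):
    """Sketch: d = -theta*g_next + beta*s, with (theta, beta) chosen so that
    g_next @ d = -||g_next||^2           (descent condition)
    y @ d      = -t * (s @ g_next)       (illustrative conjugacy condition).
    The specific conditions and parameter t are assumptions for illustration."""
    # Coefficient matrix of the 2x2 system in (theta, beta).
    A = np.array([[-(g_next @ g_next), g_next @ s],
                  [-(y @ g_next),      y @ s]])
    b = np.array([-(g_next @ g_next), -t * (s @ g_next)])
    theta, beta = np.linalg.solve(A, b)   # may be singular in degenerate cases
    return -theta * g_next + beta * s
```

Both conditions can be verified numerically on the returned direction; in a full method one would also guard against a (near-)singular system, e.g. by falling back to the steepest-descent direction −g_{k+1}.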
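For context, the standard (weak) Wolfe conditions that the paper's modified line search builds on can be stated compactly; in the modification described above, the curvature parameter σ is updated at every iteration, whereas here it is a fixed argument for illustration:

```python
import numpy as np

def wolfe_conditions(f, grad, x, d, alpha, rho=1e-4, sigma=0.9):
    """Check the standard weak Wolfe conditions for a step length alpha
    along a descent direction d (grad(x) @ d < 0). The paper's modified
    line search varies sigma per iteration; this fixed sigma is a sketch."""
    gd = grad(x) @ d                                      # directional derivative
    sufficient = f(x + alpha * d) <= f(x) + rho * alpha * gd   # Armijo condition
    curvature = grad(x + alpha * d) @ d >= sigma * gd          # second Wolfe condition
    return sufficient and curvature
```

On the quadratic f(x) = ½‖x‖² with d = −∇f(x), a full step α = 1 satisfies both conditions, while a tiny step fails the curvature condition, which is exactly what the second Wolfe condition is designed to rule out.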

