Optimization Methods & Software

A new modified scaled conjugate gradient method for large-scale unconstrained optimization with non-convex objective function


Abstract

In this paper, based on the fifth-order Taylor expansion of the objective function and the modified secant equation proposed by Li and Fukushima, a new modified secant equation is presented. In addition, a new modification of the scaled memoryless BFGS preconditioned conjugate gradient algorithm is proposed, in which the scaling parameter is computed from a two-point approximation of the new modified secant equation. A remarkable feature of the proposed method is that it is globally convergent even without a convexity assumption on the objective function. Numerical results show that the proposed modification of the scaled conjugate gradient method is efficient.
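The abstract does not give the paper's exact scaling rule, but the general family it belongs to can be illustrated. The sketch below is a generic scaled memoryless BFGS preconditioned conjugate gradient method in the Shanno/Birgin–Martínez style: the search direction is d = -Hg, where H is a single BFGS update of the scaled identity θI, and here the spectral (Barzilai–Borwein-type) value θ = sᵀs/sᵀy is used as a stand-in for the paper's secant-equation-based scaling parameter. The function names and the steepest-descent restart safeguard for non-convex regions (when sᵀy ≤ 0) are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic scaled memoryless-BFGS preconditioned CG (illustrative sketch).

    NOTE: theta below is the spectral scaling s's/s'y, NOT the scaling
    parameter derived in the paper from its modified secant equation.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        if sy > 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
            theta = s.dot(s) / sy  # spectral scaling of the identity
            # d = -H g_new with H the memoryless BFGS update of theta*I
            d = (-theta * g_new
                 + theta * (s.dot(g_new) * y + y.dot(g_new) * s) / sy
                 - (1.0 + theta * y.dot(y) / sy) * s.dot(g_new) / sy * s)
        else:
            # Safeguard for non-convex regions: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic the iterates approach the exact minimizer; on non-convex problems the curvature test and restart keep the direction a descent direction, which is the role the convergence safeguards play in methods of this type.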
