Journal of Computational and Applied Mathematics

Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization



Abstract

In this paper we introduce a new concept of approximate optimal stepsize for gradient methods, use it to interpret the strong numerical performance of the Barzilai-Borwein (BB) method, and present several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. By revising some modified BFGS update formulae, we construct new quadratic approximation models and use them to develop several approximate optimal stepsizes. Remarkably, these approximate optimal stepsizes lie in intervals that contain the two well-known BB stepsizes. We then truncate these approximate optimal stepsizes by the two BB stepsizes and use the resulting values as the new stepsizes for gradient methods. Moreover, for the nonconvex case, we design a new approximation model to generate an approximate optimal stepsize for gradient methods. We establish the convergence of the proposed methods under weaker conditions. Numerical results show that the proposed methods are very promising. (C) 2017 Elsevier B.V. All rights reserved.
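The truncation step described in the abstract clamps a candidate "approximate optimal" stepsize into the interval spanned by the two classical BB stepsizes, BB1 = s^T s / s^T y and BB2 = s^T y / y^T y, where s = x_k - x_{k-1} and y = g_k - g_{k-1}. The following is a minimal illustrative sketch of that truncation mechanism inside a gradient method; the candidate stepsize here is a simple placeholder (plain BB1), not the paper's quadratic-approximation-model stepsizes.

```python
import numpy as np

def bb_stepsizes(s, y):
    """The two classical Barzilai-Borwein stepsizes computed from the
    last step s = x_k - x_{k-1} and gradient change y = g_k - g_{k-1}."""
    sy = s @ y
    bb1 = (s @ s) / sy   # BB1 = s^T s / s^T y
    bb2 = sy / (y @ y)   # BB2 = s^T y / y^T y
    return bb1, bb2

def truncated_bb_gradient_method(grad, x0, alpha0=1.0, tol=1e-8, max_iter=500):
    """Gradient method whose stepsize is a candidate value truncated into
    the interval [min(BB1, BB2), max(BB1, BB2)], as in the abstract.
    The candidate below is illustrative only; the paper derives its
    candidates from new quadratic approximation models."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 0:  # positive curvature: BB stepsizes are well defined
            bb1, bb2 = bb_stepsizes(s, y)
            candidate = bb1  # hypothetical stand-in for an approximate optimal stepsize
            # Truncate the candidate by the two BB stepsizes:
            alpha = min(max(candidate, min(bb1, bb2)), max(bb1, bb2))
        x, g = x_new, g_new
    return x

# Example: minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = truncated_bb_gradient_method(lambda x: A @ x - b, np.zeros(3))
```

On a strictly convex quadratic the minimizer satisfies A x = b, so `x_star` should approach `np.linalg.solve(A, b)`; the curvature check `s @ y > 0` is one simple safeguard for the nonconvex case, where the paper instead designs a dedicated approximation model.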
