Journal of Computational Mathematics

Globally convergent inexact generalized Newton methods with decreasing norm of the gradient


Abstract

In this paper, motivated by the Martinez and Qi methods [1], we propose a class of globally convergent inexact generalized Newton methods for unconstrained optimization problems whose objective functions are not twice differentiable but have an LC (Lipschitz continuous) gradient. The methods force the norm of the gradient to decrease. They are implementable and globally convergent, and we prove that the algorithms attain superlinear convergence rates under some mild conditions. The methods may also be used to solve nonsmooth equations.
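As a rough, hypothetical illustration of the idea described in the abstract (not the algorithm given in the paper), the sketch below applies an inexact Newton step to the equation grad f(x) = 0: the Newton system is solved only approximately by conjugate gradients, up to a forcing term eta, and a backtracking line search on ||grad f|| enforces a monotone decrease of the gradient norm. The function names, the finite-difference approximation of the Hessian action, and the constants eta and sigma are assumptions made for this sketch only.

import numpy as np


def conjugate_gradient(Av, b, tol, max_iter=100):
    # Plain CG for A d = b, stopped when the residual norm falls below tol.
    d = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = float(r @ r)
    for _ in range(max_iter):
        if np.sqrt(rs) <= tol:
            break
        Ap = Av(p)
        pAp = float(p @ Ap)
        if pAp <= 0.0:
            # Non-positive curvature: return progress so far, or fall back
            # to the right-hand side (a steepest-descent-like direction).
            return d if np.linalg.norm(d) > 0 else b
        alpha = rs / pAp
        d = d + alpha * p
        r = r - alpha * Ap
        rs_new = float(r @ r)
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d


def inexact_newton_decreasing_grad(grad, x0, eta=0.5, sigma=1e-4,
                                   tol=1e-8, max_iter=200):
    # Sketch only: inexact Newton iteration on grad f(x) = 0 with a
    # backtracking line search on ||grad f||, so the gradient norm decreases
    # monotonically.  The Hessian action is approximated by finite
    # differences of the gradient, since f is only assumed to have a
    # Lipschitz continuous (LC) gradient.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        h = 1e-7 * max(1.0, np.linalg.norm(x))
        Bv = lambda v, x=x, g=g, h=h: (grad(x + h * v) - g) / h
        # Inexact-Newton residual test: ||B d + g|| <= eta * ||g||.
        d = conjugate_gradient(Bv, -g, tol=eta * gnorm)
        t = 1.0
        # Accept the step only if it reduces ||grad f|| by a fraction sigma*t.
        while (np.linalg.norm(grad(x + t * d)) > (1.0 - sigma * t) * gnorm
               and t > 1e-12):
            t *= 0.5
        x = x + t * d
        g = grad(x)
    return x


# Example use on a convex quadratic f(x) = 0.5 x'Ax - b'x, where grad f = Ax - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = inexact_newton_decreasing_grad(lambda x: A @ x - b, np.zeros(2))

Using ||grad f|| rather than f itself as the merit function in the line search is what makes this kind of iteration usable directly as a solver for the (possibly nonsmooth) equation grad f(x) = 0, which matches the last remark of the abstract.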