Conference: International symposium on neural networks

A New Supermemory Gradient Method without Line Search for Unconstrained Optimization

Abstract

In this paper, we present a new supermemory gradient method without line search for unconstrained optimization problems. The new method guarantees descent at each iteration. It makes full use of the iterative information from multiple previous steps and avoids the storage and computation of matrices associated with the Hessian of the objective function, making it suitable for large-scale optimization problems. We also prove its global convergence under mild conditions. In addition, we analyze the linear convergence rate of the new method when the objective function is uniformly convex and twice continuously differentiable.
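
The abstract does not reproduce the paper's exact direction and stepsize formulas, so the sketch below is only a generic illustration of the supermemory-gradient idea it describes: the search direction combines the current negative gradient with up to m previous directions, the memory weights are damped so that descent holds at every iteration, and the stepsize comes from a closed-form 1/L-type rule instead of a line search. The damping rule rho, the Lipschitz estimate L_est, and all parameter values are assumptions made for illustration, not the authors' method.

```python
import numpy as np

def supermemory_gradient(grad, x0, m=3, L_est=10.0, tol=1e-6, max_iter=1000):
    """Generic supermemory-gradient sketch (illustrative, not the paper's update).

    d_k = -g_k plus damped multiples of the last m directions; the damping
    keeps g_k^T d_k <= -0.5 * ||g_k||^2, so every d_k is a descent direction.
    The stepsize is a closed-form 1/L-type rule, so no line search (and no
    Hessian storage) is needed.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    history = []  # previous search directions (only the last m are used)
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        d = -g
        for d_old in history[-m:]:
            # Damped memory weight: each term contributes at most
            # (0.5/m)*||g|| in norm, so the sum cannot cancel the -g part.
            rho = 0.5 / (m * max(np.linalg.norm(d_old), 1e-12))
            d = d + rho * gnorm * d_old
        # Closed-form stepsize from the Lipschitz estimate L_est
        # (positive because g^T d < 0 by the damping above).
        alpha = -np.dot(g, d) / (L_est * np.dot(d, d))
        x = x + alpha * d
        g = grad(x)
        history.append(d)
    return x

# Example: minimize the uniformly convex quadratic f(x) = 0.5*||A x - b||^2,
# whose gradient is A^T (A x - b) and whose Lipschitz constant is ||A^T A||_2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = supermemory_gradient(lambda x: A.T @ (A @ x - b),
                              x0=np.zeros(2),
                              L_est=np.linalg.norm(A.T @ A, 2))
```

Because the stepsize is a fixed formula rather than a search, each iteration costs only one gradient evaluation and a few vector operations, which is the property the abstract highlights for large-scale problems.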