Tensor Methods for Unconstrained Optimization Using Second Derivatives.

Abstract

The authors introduce a new type of method for unconstrained optimization, which they call a tensor method. It is related in its basic philosophy to the tensor methods for nonlinear equations of Schnabel and Frank, but beyond that the methods have significant differences. The tensor method for unconstrained optimization bases each iteration upon a fourth-order model of the objective function. This model consists of the quadratic portion of the Taylor series, plus low-rank third- and fourth-order terms that cause the model to interpolate function and gradient values already calculated at one or more previous iterates. It is shown that the costs of forming, storing, and solving the tensor model are not significantly greater than the corresponding costs for a standard method based upon a quadratic Taylor series model. Test results are presented for sets of problems where the Hessian at the minimizer is nonsingular, and for sets where it is singular. On all the test sets, the tensor method solves considerably more problems than a comparable standard method. On problems solved by both methods, the tensor method requires, on average, about half as many iterations and half as many function and derivative evaluations as the standard method. (KR)
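For illustration only, the fourth-order model described above can be sketched as follows, where $x_c$ is the current iterate, $d$ the step, and $T_c$, $V_c$ stand for the low-rank third- and fourth-order terms chosen so that the model interpolates previously computed function and gradient values; the symbols $T_c$ and $V_c$ are notational assumptions for this sketch, not taken from the report.

\[
m_T(x_c + d) \;=\; f(x_c) \;+\; \nabla f(x_c)^{T} d \;+\; \tfrac{1}{2}\, d^{T} \nabla^{2} f(x_c)\, d \;+\; \tfrac{1}{6}\, T_c\, d^{3} \;+\; \tfrac{1}{24}\, V_c\, d^{4}
\]

The first three terms are the standard quadratic Taylor series model; because $T_c$ and $V_c$ are low rank, forming, storing, and solving this model costs little more than the quadratic model alone.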
