Conference: 2011 Third World Congress on Nature and Biologically Inspired Computing

Generalized Levenberg-Marquardt neural nets for minimization of quasiconvex scalar functions



Abstract

Neural nets that minimize quasiconvex scalar functions are designed as dynamical systems (ordinary differential equations) corresponding to well-known discrete-time algorithms such as steepest descent, Newton, and Levenberg-Marquardt. The main contribution is a generalization of the Levenberg-Marquardt algorithm, including an adaptive version, that combines good features of the Newton and Levenberg-Marquardt algorithms and yields trajectories that converge faster to the minimum of a quasiconvex objective function whose gradient and Hessian are assumed known.
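The continuous-time systems the abstract refers to can be illustrated with a minimal sketch (not taken from the paper): each discrete algorithm has an ODE counterpart x' = -d(x), integrated here by forward Euler on a convex quadratic test objective (convex, hence quasiconvex) whose gradient and Hessian are known in closed form. The constant damping `mu` stands in for the paper's adaptive scheme, which is not reproduced here.

```python
import numpy as np

# Test objective f(x) = 0.5 * x^T A x with A symmetric positive definite,
# so gradient A x and Hessian A are available in closed form.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

grad = lambda x: A @ x   # gradient of f
hess = lambda x: A       # Hessian of f (constant for a quadratic)

def integrate(direction, x0, dt=0.1, steps=300):
    """Forward-Euler discretization of the flow x' = -direction(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * direction(x)
    return x

I = np.eye(2)
mu = 0.1  # fixed LM damping; a placeholder for the paper's adaptive version
flows = {
    # steepest descent flow: x' = -grad f(x)
    "steepest descent": lambda x: grad(x),
    # Newton flow: x' = -H(x)^{-1} grad f(x)
    "Newton": lambda x: np.linalg.solve(hess(x), grad(x)),
    # Levenberg-Marquardt flow: x' = -(H(x) + mu I)^{-1} grad f(x)
    "Levenberg-Marquardt": lambda x: np.linalg.solve(hess(x) + mu * I, grad(x)),
}

x0 = np.array([2.0, -1.5])
results = {name: integrate(d, x0) for name, d in flows.items()}
```

All three trajectories converge to the minimizer at the origin on this example; the LM flow interpolates between the steepest-descent flow (large `mu`) and the Newton flow (`mu` → 0), which is the trade-off the paper's generalization exploits.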
