International Conference on Mechatronics and Automation

Common Nature of Learning between BP and Hopfield-Type Neural Networks for Convex Quadratic Minimization with Simplified Network Models

Abstract

In this paper, two different types of neural networks are investigated and employed for the online solution of strictly convex quadratic minimization: a two-layer back-propagation neural network (BPNN) and a discrete-time Hopfield-type neural network (HNN). As simplified models, their error functions can be defined directly as the quadratic objective function, from which the weight-updating formula of such a BPNN and the state-transition equation of such an HNN are further derived. It is shown that the two derived learning expressions turn out to be mathematically identical, even though the two neural networks differ greatly in network architecture, physical meaning, and training patterns. Computer simulations further substantiate the efficacy of both the BPNN and HNN models for convex quadratic minimization and, more importantly, their common nature of learning.
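To make the claimed common nature concrete, the following minimal sketch (not the authors' code) takes the error function of both simplified models to be the strictly convex quadratic objective f(x) = 0.5·xᵀAx + bᵀx, so that the gradient-descent weight update of the BPNN and the discrete-time Hopfield-type state transition both reduce to x(k+1) = x(k) − η(Ax(k) + b). The matrix A, vector b, step size η, and iteration count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' code): minimize the strictly convex
# quadratic f(x) = 0.5 * x^T A x + b^T x with A symmetric positive definite,
# using (i) a gradient-descent "weight" update, as in a simplified BPNN whose
# error function is taken to be f itself, and (ii) a discrete-time
# Hopfield-type state transition. Under these assumptions both iterations
# reduce to x(k+1) = x(k) - eta * (A x(k) + b). A, b, eta are assumed values.

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite
b = rng.standard_normal(n)
eta = 1.0 / np.linalg.norm(A, 2)   # step size < 2 / lambda_max ensures convergence

def bpnn_weight_update(w):
    """Gradient descent on the error function E(w) = f(w): w <- w - eta * grad f(w)."""
    return w - eta * (A @ w + b)

def hnn_state_transition(x):
    """Discrete-time Hopfield-type state transition for the same quadratic."""
    return x - eta * (A @ x + b)

w = np.zeros(n)                    # BPNN "weights"
x = np.zeros(n)                    # HNN state
for _ in range(300):
    w = bpnn_weight_update(w)
    x = hnn_state_transition(x)

x_star = np.linalg.solve(A, -b)    # analytic minimizer: grad f = A x + b = 0
print(np.allclose(w, x))           # True: the two iterations coincide
print(np.allclose(w, x_star))      # True: both converge to the minimizer
```

Because, under this simplification, the two update rules are the same affine map applied from the same initial point, their equality holds at every iteration rather than only in the limit, which is the sense in which the abstract speaks of a common nature of learning.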