
Adaptive Learning Algorithms to Incorporate Additional Functional Constraints into Neural Networks



Abstract

In this paper, adaptive learning algorithms to obtain better generalization performance are proposed. We specifically design cost terms for the additional functionality based on the first- and second-order derivatives defined at the hidden layer. In the course of training, these additional cost functions penalize the input-to-output mapping sensitivity and the large curvatures contained in the training data, respectively. A gradient-descent method yields hybrid learning rules that combine error back-propagation, Hebbian rules, and simple weight-decay rules. Moreover, the additional computational requirements over the standard error back-propagation algorithm are almost negligible.
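The abstract describes gradient descent on a composite cost: squared error plus a penalty on the hidden-layer first derivative (mapping sensitivity) plus weight decay. A minimal NumPy sketch of that idea for a one-hidden-layer tanh network is below; the update function `hybrid_step` and the weighting constants `lam1` (sensitivity) and `lam2` (decay) are hypothetical names, not the paper's notation, and the second-order (curvature) penalty is omitted for brevity.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """One hidden tanh layer, linear output."""
    h = np.tanh(W1 @ x + b1)   # hidden activations
    y = W2 @ h + b2
    return y, h

def hybrid_step(x, t, params, lr=0.01, lam1=1e-3, lam2=1e-4):
    """One gradient step on: squared error
       + lam1 * sum of squared hidden-layer first derivatives (sensitivity)
       + lam2 * weight decay. (Illustrative sketch, not the paper's exact rule.)"""
    W1, b1, W2, b2 = params
    y, h = forward(x, W1, b1, W2, b2)
    err = y - t

    # Standard error back-propagation for the squared-error term
    gW2 = np.outer(err, h)
    gb2 = err
    dh = (W2.T @ err) * (1 - h**2)     # tanh derivative
    gW1 = np.outer(dh, x)
    gb1 = dh

    # Sensitivity penalty: P = lam1 * sum_j f'(a_j)^2, with f' = 1 - tanh^2.
    # dP/da_j = -4 * lam1 * h_j * f'(a_j)^2, since f'' = -2 * tanh * f'.
    sens = 1 - h**2
    dpen = -4 * lam1 * h * sens**2
    gW1 += np.outer(dpen, x)
    gb1 += dpen

    # Simple weight decay on the connection weights
    gW1 += lam2 * W1
    gW2 += lam2 * W2

    return (W1 - lr * gW1, b1 - lr * gb1, W2 - lr * gW2, b2 - lr * gb2)
```

Note that each extra term reuses quantities already computed by standard back-propagation (`h` and `1 - h**2`), which is consistent with the abstract's claim that the added computational cost is almost negligible.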

