WSEAS Transactions on Circuits and Systems

Multilayer Perceptron and Neural Networks



Abstract

The attempts to solve linearly inseparable problems have led to different variations in the number of layers of neurons and in the activation functions used. The backpropagation algorithm is the best-known and most widely used supervised learning algorithm. Also called the generalized delta rule because it extends the training method of the adaline network, it is based on minimizing the difference between the desired output and the actual output by gradient descent (the gradient tells us how a function varies in different directions). Training a multilayer perceptron is often quite slow, requiring thousands or tens of thousands of epochs for complex problems. The best-known methods for accelerating learning are the momentum method and the use of a variable learning rate. The paper presents the possibility of controlling an induction drive using neural systems.
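
The abstract's description of backpropagation with a momentum term can be made concrete with a short sketch. Below is a minimal NumPy illustration on the XOR problem (a classic linearly inseparable task); the layer sizes, the learning rate eta and the momentum coefficient mu are illustrative assumptions, not values from the paper.

# Minimal sketch of backpropagation (generalized delta rule) with a momentum
# term, trained on XOR. All hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly inseparable problem: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta, mu = 0.5, 0.9                       # learning rate, momentum coefficient
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)

for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradient of the squared error (desired minus actual output).
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)

    # Momentum: new step = mu * previous step - eta * current gradient.
    vW2 = mu * vW2 - eta * (h.T @ delta_out)
    vb2 = mu * vb2 - eta * delta_out.sum(axis=0, keepdims=True)
    vW1 = mu * vW1 - eta * (X.T @ delta_h)
    vb1 = mu * vb1 - eta * delta_h.sum(axis=0, keepdims=True)
    W1 += vW1; b1 += vb1; W2 += vW2; b2 += vb2

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 3))  # should approach [0, 1, 1, 0]

The other acceleration method mentioned, a variable learning rate, would typically replace the fixed eta with a value that is increased while the error keeps decreasing and reduced when it grows.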
