
Training multilayered neural networks by replacing the least fit hidden neurons



Abstract

The author discusses a supervised-learning algorithm, called GenLearn, for training multilayered neural networks. GenLearn uses techniques from the field of genetic algorithms to perform a global search of weight space and thereby avoid the common problem of getting stuck in local minima. GenLearn is based on survival of the fittest hidden neurons: in searching for the fittest hidden neurons, it searches for a globally optimal internal representation of the input data. A major advantage of GenLearn over the generalized delta rule (GDR) for training three-layer networks is that each GenLearn iteration modifies each weight in the first weight matrix only once, whereas each GDR iteration modifies it once per output-layer neuron. As a result, although GenLearn typically reaches the desired mean square error in about the same number of iterations as the GDR, each iteration takes considerably less time.
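The abstract does not give implementation details, but the core idea of periodically re-seeding the least fit hidden neuron during training can be sketched as below. This is a minimal illustration under stated assumptions: the per-neuron fitness measure (here, the magnitude of a neuron's outgoing weights), the gradient step used between replacements, and the replacement schedule are all assumptions for illustration, not the paper's actual GenLearn operators.

```python
# Hypothetical sketch: train a 3-layer net and periodically replace the
# least fit hidden neuron with freshly initialized weights.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, Y, n_hidden=8, iters=2000, lr=0.5, replace_every=200):
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden))   # input -> hidden weights
    W2 = rng.normal(0, 0.5, (n_hidden, n_out))  # hidden -> output weights

    for t in range(1, iters + 1):
        H = sigmoid(X @ W1)          # hidden-layer activations
        P = sigmoid(H @ W2)          # network outputs
        err = P - Y                  # output error

        # Plain gradient step on the mean square error (a stand-in update,
        # shown only to make the training loop complete).
        dW2 = H.T @ (err * P * (1 - P))
        dH = (err * P * (1 - P)) @ W2.T
        dW1 = X.T @ (dH * H * (1 - H))
        W2 -= lr * dW2 / len(X)
        W1 -= lr * dW1 / len(X)

        # Assumed fitness proxy: magnitude of a hidden neuron's outgoing
        # weights. Periodically re-seed the least fit neuron's incoming and
        # outgoing weights with fresh random values.
        if t % replace_every == 0:
            fitness = np.abs(W2).sum(axis=1)
            worst = int(np.argmin(fitness))
            W1[:, worst] = rng.normal(0, 0.5, n_in)
            W2[worst, :] = rng.normal(0, 0.5, n_out)

    return W1, W2

# Toy usage: fit the XOR mapping.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, Y)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))
```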
