IEICE Transactions on Information and Systems

Neural Network Training Algorithm with Positive Correlation



Abstract

In this paper, we present a learning approach, positive correlation learning (PCL), that creates a multilayer neural network with good generalization ability. A correlation function is added to the standard error function of back-propagation learning, and the resulting error function is minimized by a steepest-descent method. During training, all the unnecessary units in the hidden layer become correlated with necessary ones in a positive sense. PCL can therefore create positively correlated activities of hidden units in response to input patterns. We show that PCL can reduce the information on the input patterns and decay the weights, both of which lead to improved generalization ability. Here, the information is defined with respect to hidden-unit activity, since the hidden units play a crucial role in storing the information on the input patterns. That is, as previously proposed, the information is defined as the difference between the uncertainty of a hidden unit at the initial stage of learning and its uncertainty at the final stage of learning. After deriving new weight-update rules for PCL, we applied the method to several standard benchmark classification problems, such as the breast cancer, diabetes, and glass identification problems. Experimental results confirmed that PCL produces positively correlated hidden units and significantly reduces the amount of information, resulting in improved generalization ability.
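The abstract does not give the exact form of the correlation function or the derived update rules, but the idea of augmenting the standard error with a term that rewards positively correlated hidden-unit activities can be sketched as below. The penalty form E = SSE − λ·(mean pairwise correlation of hidden activations) and the hyperparameter `lam` are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def pairwise_correlation(H):
    """Mean pairwise Pearson correlation between hidden-unit
    activation vectors. H has shape (n_patterns, n_hidden):
    each column is one hidden unit's activity over the input patterns."""
    C = np.corrcoef(H, rowvar=False)   # (n_hidden, n_hidden) correlation matrix
    iu = np.triu_indices(C.shape[0], k=1)  # distinct pairs, excluding diagonal
    return C[iu].mean()

def pcl_error(y_true, y_pred, H, lam=0.1):
    """Sum-of-squares error minus a correlation reward (assumed form):
    minimizing this pushes hidden units toward positive correlation."""
    sse = 0.5 * np.sum((y_true - y_pred) ** 2)
    return sse - lam * pairwise_correlation(H)
```

Under this assumed form, two hidden units with identical activation patterns contribute a correlation of +1 and lower the error, while anticorrelated units raise it; steepest descent on `pcl_error` would then drive unnecessary units toward positive correlation with necessary ones, as the paper describes.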


