Expert Systems with Applications

SOM-based information maximization to improve and interpret multi-layered neural networks: From information reduction to information augmentation approach to create new information

Abstract

The present paper aims to show the necessity of information augmentation to cope with the natural decrease in information content in multi-layered neural networks. It is natural to try to collect as much information as possible, because it is impossible to know which information is necessary or useful before learning. Thus, contrary to the conventional approach of information reduction, such as the many types of regularization, the present paper forces neural networks to form new network configurations that hold as much information as possible. For information augmentation, we use the self-organizing map (SOM), which can over-represent inputs and produce as many similar weights as possible. The method was applied to two data sets: the banknote authentication data set and the character recognition data set. In both experiments, it was confirmed that generating redundant and excessive information, in the form of an excessive number of connection weights, was associated with improved generalization. (C) 2019 Elsevier Ltd. All rights reserved.
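Reading the abstract alone, the approach appears to be: train a SOM on the inputs so that it over-represents them with many similar prototype (weight) vectors, and let those redundant prototypes serve as connection weights of a multi-layered network, so that information is augmented rather than reduced before learning. The sketch below illustrates that reading only and is not the authors' implementation; the train_som function, the grid size, the decay schedules, and the way prototypes are copied into a hidden layer are all assumptions introduced here for illustration.

import numpy as np

def train_som(X, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal online SOM on a rectangular grid.

    Returns prototype vectors of shape (grid[0] * grid[1], n_features).
    A grid that is large relative to the data over-represents the inputs,
    yielding many similar prototypes (the "information augmentation" reading).
    """
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    W = rng.normal(scale=0.1, size=(n_units, X.shape[1]))
    # 2-D map coordinates of each unit, used by the neighborhood function
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float)
    n_steps = epochs * len(X)
    step = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)                  # linearly decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5      # shrinking neighborhood radius
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))    # best-matching unit
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)  # squared map distances
            h = np.exp(-d2 / (2.0 * sigma ** 2))              # Gaussian neighborhood
            W += lr * h[:, None] * (x - W)                    # pull units toward x
            step += 1
    return W

if __name__ == "__main__":
    # Hypothetical usage: random 4-dimensional inputs stand in for, e.g., the
    # four banknote-authentication features; the SOM prototypes become a
    # deliberately redundant set of first-layer weights.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 4))
    prototypes = train_som(X, grid=(10, 10))   # 100 prototypes for 4-D inputs
    W1 = prototypes.T                          # shape (4, 100)
    hidden = np.tanh(X @ W1)                   # over-complete hidden representation
    print(hidden.shape)                        # (200, 100)

A deliberately large grid (here 10 x 10 units for 4-dimensional inputs) is what produces the "excessive number of connection weights" the abstract refers to; how the paper then trains or interprets the resulting network is not recoverable from this record.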