Expert Systems with Applications

SOM-based information maximization to improve and interpret multi-layered neural networks: From information reduction to information augmentation approach to create new information

Abstract

The present paper aims to show the necessity of information augmentation to cope with the natural decrease in information content in multi-layered neural networks. It is natural to try to collect as much information as possible, because it is impossible to know which information is necessary or useful before learning. Thus, contrary to the conventional information-reduction approach exemplified by many types of regularization, the present paper forces neural networks to form new network configurations that hold as much information as possible. For information augmentation, we use the self-organizing map (SOM), which can over-represent inputs and produce as many similar weights as possible. The method was applied to two data sets: the banknote authentication data set and the character recognition data set. In both experiments, it was confirmed that generating redundant and excessive information, in the form of an excessive number of connection weights, was associated with improved generalization. (C) 2019 Elsevier Ltd. All rights reserved.
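
The abstract does not spell out how the SOM's redundant codebook vectors are handed to the multi-layered network, so the following Python sketch is only one plausible reading made for illustration: train a small SOM on the inputs and reuse its deliberately over-complete codebook as the first-layer weights of a network. The grid size, decay schedules, and the SOM-to-MLP hand-off are all assumptions, not the authors' exact procedure.

import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0):
    """Train a rectangular SOM with exponentially decaying learning rate and neighborhood."""
    n_units = grid[0] * grid[1]
    dim = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_units, dim))            # codebook vectors
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    n_steps = epochs * len(X)
    step = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            lr = lr0 * np.exp(-step / n_steps)
            sigma = sigma0 * np.exp(-step / n_steps)
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))       # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)    # grid distance to the BMU
            h = np.exp(-d2 / (2 * sigma ** 2))                # neighborhood function
            W += lr * h[:, None] * (x - W)                    # pull BMU and neighbors toward x
            step += 1
    return W

# Toy stand-in for a data set such as banknote authentication (4 features).
X = rng.normal(size=(500, 4))

# Many similar codebook vectors give a deliberately redundant ("augmented") representation.
som_weights = train_som(X, grid=(10, 10))

# Hypothetical hand-off: use the 100 SOM codebook vectors as input-to-hidden weights
# of a multi-layered network, yielding an over-complete first layer before supervised training.
W1 = som_weights.T            # shape (4, 100): input dimension x hidden units
hidden = np.tanh(X @ W1)      # over-represented hidden activations
print(hidden.shape)           # (500, 100)

Because the 10 x 10 grid yields far more codebook vectors than the four input features strictly require, the hidden layer is intentionally redundant; this is the kind of information augmentation the abstract contrasts with information-reduction methods such as regularization.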
