
BCAP: An Artificial Neural Network Pruning Technique to Reduce Overfitting.


Abstract

Determining the optimal size of a neural network is complicated. Neural networks with many free parameters can solve very complex problems, but they are susceptible to overfitting. BCAP (Brantley-Clark Artificial Neural Network Pruning Technique) addresses overfitting by combining duplicate neurons in a neural network's hidden layer, thereby forcing the network to learn more distinct features. We compare hidden units using cosine similarity and combine those whose similarity falls within a threshold epsilon. This reduces co-adaptation among the neurons in the network, because hidden units that are highly correlated (i.e., similar) are combined. In this paper we present evidence that BCAP reduces network size while maintaining, and in some cases improving, the accuracy of neural networks during and after training.
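The merge step described above can be sketched as follows. This is a minimal illustration of the idea, not the thesis's exact algorithm: the weight matrices, the `1 - epsilon` similarity cutoff, and the choice to fold a duplicate unit's outgoing weights into the kept unit are all assumptions made for the example (summing outgoing weights exactly preserves the layer's output only when the merged units' activations coincide).

```python
import math

def cosine(u, v):
    """Cosine similarity between two weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def prune_similar_units(W_in, W_out, epsilon=0.05):
    """Combine near-duplicate hidden units (illustrative sketch).

    W_in[i]  -- incoming weight vector of hidden unit i
    W_out[i] -- outgoing weight vector of hidden unit i
    Units whose incoming vectors have cosine similarity >= 1 - epsilon
    are treated as duplicates; the duplicate's outgoing weights are
    added to the kept unit's. Returns the pruned (W_in, W_out).
    """
    keep, merged_out = [], []
    for i, w in enumerate(W_in):
        for k, j in enumerate(keep):
            if cosine(w, W_in[j]) >= 1 - epsilon:
                # Duplicate found: fold unit i's outgoing weights
                # into the already-kept unit j.
                merged_out[k] = [a + b for a, b in zip(merged_out[k], W_out[i])]
                break
        else:
            keep.append(i)
            merged_out.append(list(W_out[i]))
    return [W_in[j] for j in keep], merged_out
```

For example, two hidden units with parallel incoming vectors `[1, 0]` and `[2, 0]` have cosine similarity 1 and would be merged, while an orthogonal unit `[0, 1]` would be kept.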

Record

  • Author: Brantley, Kiante
  • Affiliation: University of Maryland, Baltimore County
  • Degree-granting institution: University of Maryland, Baltimore County
  • Subjects: Artificial intelligence; Computer science
  • Degree: M.S.
  • Year: 2016
  • Pages: 71 p.
  • Total pages: 71
  • Original format: PDF
  • Language: English

