
More on Overfitting in Learning Discrete Patterns


Abstract

Understanding and preventing overfitting is a very important issue in artificial neural network design, implementation, and application. Weigend (1994) reports that the presence or absence of overfitting in neural networks depends on how the testing error is measured, and that there is no overfitting in terms of the classification error (symbolic-level error). In this paper, we show that, in terms of the classification error, overfitting does occur for certain representations used to encode discrete attributes. We design simple Boolean functions with a clear rationale, and present experimental results to support our claims. In addition, we report some interesting results on how the generalization ability of networks varies with their size.
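The experimental setting the abstract describes can be sketched in a few lines. The following is a minimal, hypothetical illustration only (the Boolean function, network size, and hyperparameters here are assumptions, not the paper's actual design): a small one-hidden-layer network is trained on most of a Boolean function's truth table, and the test error is measured as classification error by thresholding the network's output, as opposed to measuring the continuous squared error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: 3-of-5 majority, a simple Boolean function of 5 bits.
def target(x):
    return (x.sum(axis=1) >= 3).astype(float)

# Full truth table (32 rows); hold out a quarter of it as the test set.
X = np.array([[(i >> b) & 1 for b in range(5)] for i in range(32)], dtype=float)
y = target(X)
idx = rng.permutation(32)
train, test = idx[:24], idx[24:]

# One hidden layer of 8 tanh units, sigmoid output, trained by plain
# gradient descent on the squared error (all sizes chosen for illustration).
H = 8
W1 = rng.normal(0, 0.5, (5, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def class_error(x, t):
    # Symbolic-level error: threshold the output at 0.5 and count mismatches.
    _, p = forward(x)
    return float(np.mean((p[:, 0] > 0.5) != (t > 0.5)))

lr = 0.5
t = y[train][:, None]
for epoch in range(2000):
    h, p = forward(X[train])
    # Backpropagation through squared error and the sigmoid output.
    dp = (p - t) * p * (1 - p)
    dW2 = h.T @ dp / len(train); db2 = dp.mean(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)
    dW1 = X[train].T @ dh / len(train); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

err_train = class_error(X[train], y[train])
err_test = class_error(X[test], y[test])
```

Tracking `err_test` over training epochs (rather than the squared error on the test set) is what distinguishes the classification-error view of overfitting discussed above; with a different encoding of the discrete inputs, the curve of `err_test` can behave quite differently.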
