A comparison of weight elimination methods for reducing complexity in neural networks

Abstract

Three methods are examined for reducing complexity in potentially oversized networks. These consist of removing redundant elements based on some measure of saliency, adding a further term to the cost function that penalizes complexity, or monitoring the error on a separate validation set of examples and stopping training as soon as that performance begins to deteriorate. A series of simulation examples demonstrates that all of these methods can significantly improve generalization, but their performance can prove to be domain dependent.
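
As a rough illustration of the second and third approaches, the sketch below pairs a weight-elimination style complexity penalty (assumed here to be the sum of w_i^2 / (w0^2 + w_i^2) form associated with Weigend et al., since the abstract does not state the exact penalty) with a simple patience-based validation check for early stopping. All names (weight_elimination_penalty, EarlyStopping, w0, lam, patience) are illustrative and not taken from the paper.

    import numpy as np

    def weight_elimination_penalty(weights, w0=1.0):
        # Penalty of the form sum_i (w_i/w0)^2 / (1 + (w_i/w0)^2):
        # small weights are penalized roughly quadratically, large
        # weights saturate near 1, so the term favours eliminating
        # weights that contribute little.
        r = (weights / w0) ** 2
        return np.sum(r / (1.0 + r))

    def penalty_gradient(weights, w0=1.0):
        # Derivative of the penalty with respect to each weight;
        # scaled by lam and added to the error gradient during training.
        r = (weights / w0) ** 2
        return (2.0 * weights / w0 ** 2) / (1.0 + r) ** 2

    def total_cost(data_error, weights, lam=1e-3, w0=1.0):
        # Augmented cost: data error plus lam times the complexity penalty.
        return data_error + lam * weight_elimination_penalty(weights, w0)

    class EarlyStopping:
        # Stop training once the validation error has failed to improve
        # for `patience` consecutive checks.
        def __init__(self, patience=5):
            self.patience = patience
            self.best = float("inf")
            self.bad_checks = 0

        def should_stop(self, val_error):
            if val_error < self.best:
                self.best = val_error
                self.bad_checks = 0
            else:
                self.bad_checks += 1
            return self.bad_checks >= self.patience

In a training loop one would add lam * penalty_gradient(weights) to the usual error gradient and call should_stop(validation_error) every few epochs. The first, saliency-based method described in the abstract would instead rank the weights of a trained network by their estimated effect on the error and delete the lowest-ranked ones.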