Journal of Physics A: Mathematical and Theoretical
Generalization from correlated sets of patterns in the perceptron

Abstract

Generalization is a central aspect of learning theory. Here, we propose a framework that explores an auxiliary task-dependent notion of generalization, and attempts to quantitatively answer the following question: given two sets of patterns with a given degree of dissimilarity, how easily will a network be able to 'unify' their interpretation? This is quantified by the volume of the configurations of synaptic weights that classify the two sets in a similar manner. To show the applicability of our idea in a concrete setting, we compute this quantity for the perceptron, a simple binary classifier, using the classical statistical physics approach in the replica-symmetric ansatz. In this case, we show how an analytical expression measures the 'distance-based capacity', the maximum load of patterns sustainable by the network, at fixed dissimilarity between patterns and fixed allowed number of errors. This curve indicates that generalization is possible at any distance, but with decreasing capacity. We propose that a distance-based definition of generalization may be useful in numerical experiments with real-world neural networks, and to explore computationally sub-dominant sets of synaptic solutions.
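The quantity described above, the volume of synaptic weight configurations that classify two correlated pattern sets in a similar manner, can be illustrated numerically. The sketch below is a naive Monte Carlo estimate for a small perceptron, not the replica-symmetric calculation of the paper; the function names, the Gaussian-mixing construction of correlated sets, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_sets(p, n, overlap, rng):
    """Two sets of p patterns in n dimensions with tunable similarity:
    overlap=1 gives identical sets, overlap=0 independent ones
    (an assumed construction, for illustration only)."""
    base = rng.standard_normal((p, n))
    noise = rng.standard_normal((p, n))
    other = overlap * base + np.sqrt(1.0 - overlap**2) * noise
    return base, other

def agreement_volume(xi_a, xi_b, labels, n_samples=20000, max_errors=0, rng=rng):
    """Monte Carlo estimate of the fraction of random synaptic weight
    vectors (Gaussian, i.e. uniform in direction) that classify BOTH
    pattern sets with the given labels, allowing up to max_errors
    mistakes per set."""
    n = xi_a.shape[1]
    w = rng.standard_normal((n_samples, n))          # candidate weights
    errs_a = (np.sign(xi_a @ w.T).T != labels).sum(axis=1)
    errs_b = (np.sign(xi_b @ w.T).T != labels).sum(axis=1)
    return np.mean((errs_a <= max_errors) & (errs_b <= max_errors))

p, n = 3, 50
labels = rng.choice([-1, 1], size=p)

xi_a, xi_b = correlated_sets(p, n, overlap=0.9, rng=rng)
close = agreement_volume(xi_a, xi_b, labels)   # similar sets

xi_a, xi_b = correlated_sets(p, n, overlap=0.1, rng=rng)
far = agreement_volume(xi_a, xi_b, labels)     # dissimilar sets
```

For dissimilar sets the fraction of weights that "unify" both interpretations shrinks, consistent with the abstract's claim that generalization remains possible at any distance but with decreasing capacity; raising `max_errors` enlarges the compatible volume.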
