Neural Networks: The Official Journal of the International Neural Network Society

Algebraic geometrical methods for hierarchical learning machines.



Abstract

Hierarchical learning machines such as layered perceptrons, radial basis functions, and Gaussian mixtures are non-identifiable learning machines whose Fisher information matrices are not positive definite. Consequently, conventional statistical asymptotic theory cannot be applied to neural network learning theory: for example, either the Bayesian a posteriori probability distribution does not converge to a Gaussian distribution, or the generalization error is not proportional to the number of parameters. The purpose of this paper is to overcome this problem and to clarify the relation between the learning curve of a hierarchical learning machine and the algebraic geometrical structure of its parameter space. We establish an algorithm for calculating the Bayesian stochastic complexity based on blowing-up technology from algebraic geometry, and prove that the Bayesian generalization error of a hierarchical learning machine is smaller than that of a regular statistical model, even if the true distribution is not contained in the parametric model.
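The degenerate Fisher information matrix can be seen in a toy model. The sketch below is an illustration, not code from the paper; the model f(x; a, b) = a·tanh(bx) and the parameter values are assumed for demonstration. At the non-identifiable point a = 0 the output no longer depends on b, so a Monte Carlo estimate of the Fisher matrix E[∇f ∇fᵀ] has a zero eigenvalue:

```python
import numpy as np

# Toy regression model f(x; a, b) = a * tanh(b * x) with unit Gaussian noise.
# At a = 0 the model output is independent of b, so the Fisher information
# matrix E[grad f grad f^T] is singular (not positive definite).

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)           # inputs drawn from N(0, 1)

a, b = 0.0, 1.0                        # a = 0 is a non-identifiable point
df_da = np.tanh(b * x)                 # d f / d a
df_db = a * x / np.cosh(b * x) ** 2    # d f / d b  (identically 0 when a = 0)

grads = np.stack([df_da, df_db])       # 2 x N matrix of score components
fisher = grads @ grads.T / x.size      # Monte Carlo estimate of E[g g^T]

eigvals = np.linalg.eigvalsh(fisher)   # ascending eigenvalues
print(eigvals)                         # smallest eigenvalue is 0: singular
```

Because the usual quadratic (Laplace) approximation of the posterior requires the smallest Fisher eigenvalue to be bounded away from zero, this singularity is exactly what invalidates the conventional asymptotic theory at such points.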
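As background for the claims in the abstract (this summary states the standard form from singular learning theory, not a formula quoted from the paper): the Bayesian stochastic complexity F(n) for n samples admits an asymptotic expansion whose leading coefficient λ, together with its multiplicity m, is computed from a resolution of singularities (blow-ups) of the parameter set:

```latex
F(n) \;=\; \lambda \log n \;-\; (m-1)\,\log\log n \;+\; O(1),
\qquad
G(n) \;\approx\; \frac{\lambda}{n},
\qquad
\lambda \;\le\; \frac{d}{2}.
```

Here G(n) is the Bayesian generalization error and d is the number of parameters; a regular statistical model attains λ = d/2, while singular hierarchical models have strictly smaller λ, which is the sense in which their generalization error is smaller than that of a regular model.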


