International Journal of Adaptive Control and Signal Processing

Bounds on the complexity of neural-network models and comparison with linear methods



Abstract

A class of non-linear models having the structure of combinations of simple, parametrized basis functions is investigated; this class includes widely used neural networks, in which the basis functions correspond to the networks' computational units. Bounds on the complexity of such models are derived in terms of the number of adjustable parameters necessary for a given modelling accuracy. These bounds guarantee a more advantageous tradeoff between modelling accuracy and model complexity than linear methods: the number of parameters may grow much more slowly, in some cases only polynomially, with the dimensionality of the input space of the modelling task. Polynomial bounds on complexity allow one to cope with the so-called 'curse of dimensionality', which often makes linear methods either inaccurate or computationally unfeasible. The presented results provide a deeper theoretical insight into the effectiveness of neural-network architectures observed in complex modelling applications.
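As a purely illustrative sketch, not taken from the abstract itself: models of the kind described above can be written in a variable-basis form (the symbols f_n, \varphi, c_i, \theta_i are introduced here only for illustration),

\[ f_n(x) \;=\; \sum_{i=1}^{n} c_i \, \varphi(x, \theta_i), \qquad x \in \mathbb{R}^d , \]

where \varphi is a simple parametrized basis function (e.g., a sigmoidal computational unit) with adjustable inner parameters \theta_i and outer coefficients c_i. For target functions f satisfying suitable smoothness conditions, classical Maurey-Jones-Barron-type results give approximation errors of order

\[ \| f - f_n \| \;=\; O\!\left(n^{-1/2}\right), \]

a rate that does not depend on the input dimension d, whereas linear approximation from a fixed n-dimensional basis typically achieves only a rate of order n^{-1/d} on comparable function classes, so reaching accuracy \varepsilon may require on the order of \varepsilon^{-d} parameters. Bounds of this kind are what underlie the more favourable accuracy-complexity tradeoff the abstract refers to.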


