Journal: JMLR: Workshop and Conference Proceedings

Universal Approximation with Deep Narrow Networks



Abstract

The classical Universal Approximation Theorem holds for neural networks of arbitrary width and bounded depth. Here we consider the natural ‘dual’ scenario for networks of bounded width and arbitrary depth. Precisely, let $n$ be the number of input neurons, $m$ be the number of output neurons, and let $\rho$ be any nonaffine continuous function with a continuous nonzero derivative at some point. Then we show that the class of neural networks of arbitrary depth, width $n + m + 2$, and activation function $\rho$ is dense in $C(K; \mathbb{R}^m)$ for $K \subseteq \mathbb{R}^n$ with $K$ compact. This covers every activation function possible to use in practice, and also includes polynomial activation functions, unlike the classical version of the theorem, and provides a qualitative difference between deep narrow networks and shallow wide networks. We then consider several extensions of this result. In particular, we consider nowhere-differentiable activation functions, density in noncompact domains with respect to the $L^p$-norm, and how the width may be reduced to just $n + m + 1$ for ‘most’ activation functions.
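As a concrete illustration of the architecture the theorem speaks about, here is a minimal NumPy sketch of a deep narrow network with constant hidden width $n + m + 2$. The softplus activation, the depth, and the random weights are assumptions for the example only; the theorem asserts that *some* choice of weights within this class approximates any continuous target on a compact set, not that these particular ones do.

```python
import numpy as np

def rho(x):
    # Softplus: a nonaffine continuous activation with a continuous
    # nonzero derivative, so it satisfies the theorem's hypothesis.
    # (Polynomial activations such as x**2 would also qualify here.)
    return np.log1p(np.exp(x))

def deep_narrow_net(x, weights, biases):
    # Every hidden layer has width n + m + 2; the final layer is an
    # affine readout producing the m output values.
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = rho(W @ h + b)
    return weights[-1] @ h + biases[-1]

# Example dimensions: n = 3 inputs, m = 1 output, hidden width 3+1+2 = 6.
n, m = 3, 1
width, depth = n + m + 2, 4  # depth (number of hidden layers) is arbitrary
rng = np.random.default_rng(0)
shapes = [(width, n)] + [(width, width)] * (depth - 1) + [(m, width)]
weights = [0.5 * rng.standard_normal(s) for s in shapes]
biases = [0.1 * rng.standard_normal(s[0]) for s in shapes]
print(deep_narrow_net(rng.standard_normal(n), weights, biases))
```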
