General approximation theorem on feedforward networks

Abstract

We show that standard feedforward neural networks with as few as a single hidden layer and arbitrary bounded nonlinear (continuous or noncontinuous) activation functions that have two unequal limits at the infinities can uniformly approximate (in contrast to approximation in measure) arbitrary bounded continuous mappings on R^n with any precision. In particular, on a compact subset of R^n, standard feedforward neural networks with as few as a single hidden layer and arbitrary bounded nonlinear (continuous or noncontinuous) activation functions can uniformly approximate arbitrary continuous mappings with any precision. These results also hold for standard feedforward neural networks with multiple hidden layers. We also find that the conditions of boundedness and unequal limits at the infinities on the activation functions are sufficient but not necessary.
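A minimal numerical sketch of the theorem's claim (not the paper's construction): a single-hidden-layer network y(x) = sum_k c_k * tanh(w_k * x + b_k) is fitted to a continuous target on a compact interval. tanh is bounded with unequal limits at the infinities (-1 and +1), so it satisfies the stated condition on the activation; the target function, network width, and random-feature least-squares fit below are illustrative choices of ours, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):
        # an arbitrary continuous target mapping on the compact set [-pi, pi]
        return np.sin(x) + 0.5 * np.cos(3 * x)

    n_hidden = 50                                  # width of the single hidden layer
    x = np.linspace(-np.pi, np.pi, 400)            # dense grid on the compact set
    w = rng.normal(scale=3.0, size=n_hidden)       # hidden weights, fixed at random
    b = rng.uniform(-np.pi, np.pi, size=n_hidden)  # hidden biases, fixed at random

    # tanh is bounded and has unequal limits at -inf and +inf (-1 and +1)
    H = np.tanh(np.outer(x, w) + b)                # hidden-layer outputs, shape (400, n_hidden)

    # fit only the output weights c by least squares
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

    err = np.max(np.abs(H @ c - f(x)))             # sup-norm (uniform) error on the grid
    print(f"sup-norm error with {n_hidden} hidden units: {err:.4f}")

Increasing n_hidden drives the reported sup-norm error toward zero, which is the practical reading of "uniformly approximate ... with any precision" on a compact set.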
