Japan Journal of Industrial and Applied Mathematics

Error bounds for ReLU networks with depth and width parameters


Abstract

Neural networks have shown highly successful performance across a wide range of tasks, but further study is needed to improve their performance. We construct a specific neural network architecture with local connections that is a universal approximator, and analyze its approximation error. This locally connected network has broader applicability than a fully connected one, because local connectivity can be used to describe diverse neural networks such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the hidden layers, and the other the width of the hidden layers.
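The abstract does not give the construction itself, but the idea of a locally connected ReLU layer can be illustrated with a minimal sketch: unlike a fully connected layer, each hidden unit sees only a small window of the input (like a convolution, but without weight sharing). The function name, window size, and stride below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def relu(x):
    # ReLU activation: elementwise max(x, 0)
    return np.maximum(x, 0.0)

def locally_connected_layer(x, weights, window):
    """Hypothetical locally connected ReLU layer (stride 1, no weight sharing).

    weights[i] is applied only to the input window x[i : i + window],
    so each hidden unit has a local receptive field.
    """
    out = np.empty(len(weights))
    for i, w in enumerate(weights):
        out[i] = relu(w @ x[i:i + window])
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(8)        # input of dimension 8
window = 3                        # local receptive field size (assumed)
n_units = len(x) - window + 1     # 6 hidden units for stride 1
weights = rng.standard_normal((n_units, window))

h = locally_connected_layer(x, weights, window)
print(h.shape)
```

Setting `window = len(x)` recovers a fully connected layer as a special case, which is one way to see why results for locally connected networks subsume the fully connected setting.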
