Youth Academic Annual Conference of Chinese Association of Automation

On Theoretical Analysis of Single Hidden Layer Feedforward Neural Networks with Relu Activations



Abstract

Over the past decades, the extreme learning machine has gained considerable popularity due to its fast training speed and ease of implementation. Although the extreme learning machine has been proved valid when using an infinitely differentiable function such as the sigmoid as the activation, existing extreme learning machine theory pays little attention to non-differentiable activation functions. However, a non-differentiable activation function, the rectified linear unit (ReLU) in particular, has been demonstrated to enable better training of deep neural networks than the previously widespread sigmoid activation, and today ReLU is the most popular choice for deep neural networks. Therefore, in this note we consider an extreme learning machine that adopts a non-smooth activation function, and show that a ReLU-activated single hidden layer feedforward neural network (SLFN) is capable of fitting the given training data points with zero error, provided that sufficiently many neurons are placed in the hidden layer. The proof relies on a slightly different assumption from the original one, but that assumption remains easy to satisfy. In addition, we find that the squared fitting error is monotonically non-increasing in the number of hidden nodes, which in turn means that a wider SLFN has greater expressive capacity.
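The two claims of the abstract can be illustrated numerically in the ELM style: draw random ReLU hidden-layer weights, solve for the output weights by least squares, and watch the squared fitting error shrink to zero once the hidden layer is at least as wide as the number of training points. This is a minimal sketch, not the paper's construction; the nested-feature trick (slicing columns of one wide random hidden layer) is an assumption made here so that the monotone non-increase of the residual is guaranteed by linear algebra.

```python
import numpy as np

rng = np.random.default_rng(0)

# N training points in d dimensions with arbitrary targets.
N, d, max_hidden = 20, 3, 40
X = rng.standard_normal((N, d))
y = rng.standard_normal(N)

# One wide random ReLU hidden layer; narrower SLFNs are obtained
# by keeping only the first n columns (nested hidden layers).
W = rng.standard_normal((d, max_hidden))
b = rng.standard_normal(max_hidden)
H_full = np.maximum(X @ W + b, 0.0)  # ReLU activations, shape (N, max_hidden)

def squared_fit_error(H, y):
    """Least-squares output weights (ELM style) and the squared residual."""
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    r = H @ beta - y
    return float(r @ r)

errors = [squared_fit_error(H_full[:, :n], y) for n in (5, 10, 20, 40)]
```

Because each narrower hidden layer spans a subspace of the wider one, `errors` is non-increasing, and once the width reaches N = 20 the random ReLU feature matrix is almost surely full row rank, so the fit is exact (zero error) from that point on.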
