
Switch Unit (SU): A Novel Type of Unit for the Activation Function



Abstract

The paper presents a novel unit, the switch unit (SU). This unit implements a conditional inclusion function that maps an input from a one-dimensional space into a two-dimensional space, depending on whether the input is positive or not. Of the two output entries, the derivative with respect to the input is always one for one entry and zero for the other. The proposed unit is able to cope with the vanishing-gradient issues induced by activation functions such as ReLUs (rectified linear units). In the experiments, different neural networks built on the proposed units are trained and tested on the CIFAR-10 and CIFAR-100 datasets, yielding results comparable to those of ReLU-based neural networks with the same parameters.

