Big Data Mining and Analytics

Feature representations using the reflected rectified linear unit (RReLU) activation


Abstract

Deep Neural Networks (DNNs) have become the tool of choice for machine learning practitioners today. One important aspect of designing a neural network is the choice of the activation function used at the neurons of the different layers. In this work, we introduce a four-output activation function called the Reflected Rectified Linear Unit (RReLU) activation, which considers both a feature and its negation during computation. Our activation function is "sparse", in that only two of the four possible outputs are active at a given time. We test our activation function on the standard MNIST and CIFAR-10 datasets, which are classification problems, as well as on a novel Computational Fluid Dynamics (CFD) dataset posed as a regression problem. On the baseline network for the MNIST dataset, which has two hidden layers, our activation function improves the validation accuracy from 0.09 to 0.97 compared to the well-known ReLU activation. For the CIFAR-10 dataset, we use a deep baseline network that achieves 0.78 validation accuracy after 20 epochs but overfits the data. Using the RReLU activation, we can achieve the same accuracy without overfitting. For the CFD dataset, we show that the RReLU activation can reduce the number of epochs from 100 (using ReLU) to 10 while obtaining the same level of performance.
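The abstract does not give the exact formula for RReLU, but its description (an activation built from a feature and its negation, with only two of four outputs active at a time) suggests a layer of roughly the following shape. The sketch below is an assumed interpretation, not the paper's definition; the class name RReLU4 and the ordering of the outputs are illustrative only.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RReLU4(nn.Module):
        # Hypothetical sketch of a four-output "reflected" ReLU: for each
        # input feature x it emits ReLU(x), ReLU(-x), and their sign-flipped
        # reflections. For any given x, at most two of the four outputs are
        # non-zero, matching the sparsity described in the abstract.
        def forward(self, x):
            pos = F.relu(x)    # non-zero when x > 0
            neg = F.relu(-x)   # non-zero when x < 0
            return torch.cat([pos, neg, -pos, -neg], dim=-1)

Note that under this reading the layer quadruples the size of the feature dimension, so any layer that follows it would need its input width adjusted accordingly.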
