International Symposium on Electronics and Telecommunications

P-Swish: Activation Function with Learnable Parameters Based on Swish Activation Function in Deep Learning



Abstract

The activation function is an important aspect of deep neural network performance and one that must be researched continuously, which is why we have extended our work in this direction. We introduce a novel P-Swish (Parametric Swish) activation function, which improves performance on object classification tasks using datasets such as CIFAR-10 and CIFAR-100; we also evaluate it on Natural Language Processing (NLP) datasets. To test it, we used several architectures, including LeNet-5, Network in Network (NiN), and ResNet34, comparing P-Swish against popular activation functions such as sigmoid, ReLU, and Swish. In particular, P-Swish facilitates fast network training, which makes it well suited to the transfer learning technique.
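The abstract does not reproduce the exact parameterization of P-Swish, so as a rough illustration of the underlying idea the sketch below assumes the standard parametric Swish form f(x) = x · σ(βx), with β treated as the learnable parameter (the paper's P-Swish may use a different or richer parameterization). In a deep-learning framework, β would be a trainable tensor updated by backpropagation alongside the network weights.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid σ(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def p_swish(x: float, beta: float = 1.0) -> float:
    """Parametric Swish sketch: f(x) = x * sigmoid(beta * x).

    beta is the learnable parameter (an assumption here, following the
    original Swish formulation): beta = 1 recovers plain Swish/SiLU,
    beta -> infinity approaches ReLU, and beta = 0 gives the linear
    function x / 2.
    """
    return x * sigmoid(beta * x)
```

With beta = 1, `p_swish(1.0)` equals Swish(1) ≈ 0.731, and a large beta makes the function behave almost like ReLU, which illustrates why a trainable beta lets the network interpolate between these regimes during training.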


