Computational Intelligence and Neuroscience

Deep Neural Networks with Multistate Activation Functions


Abstract

We propose multistate activation functions (MSAFs) for deep neural networks (DNNs). These MSAFs are a new kind of activation function capable of representing more than two states; they include the N-order MSAFs and the symmetrical MSAF. DNNs with these MSAFs can be trained via conventional Stochastic Gradient Descent (SGD) as well as mean-normalised SGD. We also discuss how these MSAFs perform when used to resolve classification problems. Experimental results on the TIMIT corpus reveal that, on speech recognition tasks, DNNs with MSAFs outperform conventional DNNs, achieving a relative improvement of 5.60% in phoneme error rate. Further experiments reveal that mean-normalised SGD facilitates the training of DNNs with MSAFs, especially with large training sets. When the training set is sufficiently large, the models can also be trained directly without pretraining, yielding a considerable relative improvement of 5.82% in word error rate.
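The abstract does not reproduce the MSAF definitions themselves. As a minimal sketch of the idea, assume (this is an assumption, not the paper's exact formulation) that an N-order MSAF is a sum of N logistic sigmoids, each shifted by a further constant offset, so the output plateaus near 0, 1, ..., N rather than saturating at only two states, and that the symmetrical MSAF subtracts a constant so its plateaus sit symmetrically about zero. The function names, the `order` and `offset` parameters, and the offset value below are all illustrative, not the authors' notation:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic function, the two-state building block."""
    return 1.0 / (1.0 + np.exp(-x))

def msaf(x, order=2, offset=4.0):
    """Hypothetical N-order MSAF: a sum of `order` logistic sigmoids,
    each shifted right by a further `offset`. The output climbs through
    plateaus near 0, 1, ..., `order`, giving more than two representable
    states. The offset value is illustrative, not taken from the paper."""
    return sum(sigmoid(x - k * offset) for k in range(order))

def symmetrical_msaf(x, offset=4.0):
    """Hypothetical symmetrical MSAF: two shifted sigmoids minus one,
    so the plateaus sit near -1, 0 and +1, symmetric about the origin."""
    return sigmoid(x + offset) + sigmoid(x - offset) - 1.0

# Sampling the 3-order variant shows the staircase of plateaus.
x = np.linspace(-8.0, 16.0, 7)
print(np.round(msaf(x, order=3), 3))
```

Under this reading, the derivative of an MSAF is simply a sum of sigmoid derivatives, so the function stays differentiable everywhere, which is consistent with the abstract's claim that such networks can be trained with conventional SGD.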
