Computer Speech and Language

DBMiP: A pre-training method for information propagation over deep networks



Abstract

Deep neural networks (DNNs) have recently been successful in many applications and have become a popular approach for speech recognition. Training a DNN model for speech recognition is computationally expensive due to the model's large number of parameters. Pre-training improves DNN modeling; however, DNN learning is challenging if pre-training is inefficient. This paper introduces a new pre-training framework that utilizes label information in the lower layers (layers near the input) for better recognition. The proposed pre-training method dynamically inserts discriminative information not only into the last layer but also into the other layers. In this algorithm, the lower layers receive more generative information while the higher layers receive more discriminative information. In addition, the method uses speaker information by employing the Subspace Gaussian Mixture Model (SGMM), which improves recognition accuracy. Experimental results on the TIMIT, MNIST, Switchboard, and English Broadcast News datasets show that this approach significantly outperforms current state-of-the-art methods such as the Deep Belief Network and the Deep Boltzmann Machine. Moreover, the proposed algorithm has minimal memory requirements. (C) 2018 Elsevier Ltd. All rights reserved.
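The abstract describes the method only at a high level. As one possible reading, the Python sketch below shows layer-wise pre-training in which each layer's objective mixes a generative (reconstruction) term with a discriminative (label) term, weighted toward the generative term in lower layers and toward the discriminative term in higher layers. All names and choices here (pretrain_layerwise, disc_weight, the tanh/MSE/cross-entropy losses) are illustrative assumptions rather than the paper's DBMiP formulation, and the SGMM-based speaker information is omitted.

    # Illustrative sketch only: layer-wise pre-training where each layer's loss
    # mixes a generative (reconstruction) term and a discriminative (label) term.
    # Lower layers weight the generative term more; higher layers the discriminative term.
    import torch
    import torch.nn as nn

    def pretrain_layerwise(layer_sizes, n_classes, data, labels, epochs=5, lr=1e-3):
        layers = []
        frozen_input = data
        n_hidden = len(layer_sizes) - 1
        for i in range(n_hidden):
            layer = nn.Linear(layer_sizes[i], layer_sizes[i + 1])
            decoder = nn.Linear(layer_sizes[i + 1], layer_sizes[i])   # generative (reconstruction) head
            classifier = nn.Linear(layer_sizes[i + 1], n_classes)     # discriminative (label) head
            # Mixing weight grows with depth: mostly generative low, mostly discriminative high.
            disc_weight = (i + 1) / n_hidden
            params = (list(layer.parameters()) + list(decoder.parameters())
                      + list(classifier.parameters()))
            opt = torch.optim.Adam(params, lr=lr)
            for _ in range(epochs):
                h = torch.tanh(layer(frozen_input))
                gen_loss = nn.functional.mse_loss(decoder(h), frozen_input)
                disc_loss = nn.functional.cross_entropy(classifier(h), labels)
                loss = (1.0 - disc_weight) * gen_loss + disc_weight * disc_loss
                opt.zero_grad()
                loss.backward()
                opt.step()
            layers.append(layer)
            # Freeze the trained layer's output as the input for the next layer.
            frozen_input = torch.tanh(layer(frozen_input)).detach()
        # Stack the pre-trained layers; a task-specific output layer would be added before fine-tuning.
        return nn.Sequential(*sum([[l, nn.Tanh()] for l in layers], []))

For MNIST-sized inputs this could be called as pretrain_layerwise([784, 512, 256], 10, x, y) before fine-tuning the stacked network with standard backpropagation.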

