Conference of the International Speech Communication Association

Prefix Tree based N-best list Re-scoring for Recurrent Neural Network Language Model used in Speech Recognition System


Abstract

Recurrent Neural Network Language Models (RNNLMs) have recently been shown to outperform N-gram language models (LMs) as well as many other competing advanced LM techniques. However, training and testing RNNLMs is very time-consuming, so in real-time recognition systems an RNNLM is usually used only to re-score an n-best list of limited size. In this paper, we explore ways of speeding up RNNLM evaluation when it is used to re-rank a large n-best list. A new n-best list re-scoring framework, Prefix Tree based N-best list Re-scoring (PTNR), is proposed to eliminate the redundant computations that make standard re-scoring inefficient. In addition, the bunch mode technique, widely used to speed up the training of feed-forward neural network language models, is combined with PTNR to further improve re-scoring speed. Experimental results showed that the proposed approach is much faster than standard n-best list re-scoring: taking the 1000-best list as an example, our approach was almost 11 times faster.
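The key observation behind PTNR is that hypotheses in an n-best list for the same utterance share long word prefixes, so running the RNNLM over each hypothesis from scratch recomputes the same hidden states many times. Below is a minimal numpy sketch of this prefix-sharing idea, assuming a simple Elman-style RNNLM with a full softmax; the paper's actual model, class-based output factorization, and bunch-mode batching are not reproduced, and all names (E, W, Wo, rescore_ptnr, ...) are illustrative rather than taken from the paper.

```python
# Illustrative sketch only: a trie over n-best hypotheses caches the RNN state
# and next-word log-probabilities at every node, so each distinct prefix is
# evaluated by the network exactly once.
import numpy as np

V, H = 10000, 256                       # assumed vocabulary and hidden sizes
rng = np.random.default_rng(0)
E  = rng.standard_normal((V, H)) * 0.1  # input word embeddings
W  = rng.standard_normal((H, H)) * 0.1  # recurrent weights
Wo = rng.standard_normal((H, V)) * 0.1  # output projection

def step_state(h, w):
    """Advance the RNN state by consuming word id w."""
    return np.tanh(E[w] + W @ h)

def next_word_logprobs(h):
    """Log P(next word | history) from the current state (stable log-softmax)."""
    logits = h @ Wo
    m = logits.max()
    return logits - (m + np.log(np.exp(logits - m).sum()))

class Node:
    __slots__ = ("h", "logp", "children")
    def __init__(self, h):
        self.h = h           # RNN state after consuming this node's prefix
        self.logp = None     # cached next-word log-probs (computed lazily)
        self.children = {}   # word id -> child Node

def rescore_ptnr(hypotheses):
    """Re-score word-id sequences, sharing all computation over common prefixes."""
    root = Node(np.zeros(H))
    scores = []
    for hyp in hypotheses:
        node, total = root, 0.0
        for w in hyp:
            if node.logp is None:            # one softmax per distinct prefix
                node.logp = next_word_logprobs(node.h)
            total += node.logp[w]
            if w not in node.children:       # one recurrence per distinct edge
                node.children[w] = Node(step_state(node.h, w))
            node = node.children[w]
        scores.append(total)
    return scores

# Toy usage: the shared prefix [5, 17] is evaluated only once for all three.
nbest = [[5, 17, 42, 3], [5, 17, 42, 8], [5, 17, 99, 3]]
print(rescore_ptnr(nbest))
```

In this sketch the bunch mode mentioned above would correspond to batching the next_word_logprobs calls for many trie nodes at once, replacing matrix-vector products with matrix-matrix products; how the paper schedules those batches is not shown here.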