International Conference on Computational Intelligence and Security

An Improved Mandarin Voice Input System Using Recurrent Neural Network Language Model



Abstract

In this paper, we present our recent work on using a Recurrent Neural Network Language Model (RNNLM) in a Mandarin voice input system. Specifically, the RNNLM is used in conjunction with a large high-order n-gram language model (LM) to re-score the N-best list. However, we observe that repeated computations over shared hypothesis prefixes make the rescoring procedure inefficient. We therefore propose a new n-best-list rescoring framework, Prefix Tree based N-best list Rescore (PTNR), which eliminates these repeated computations entirely and speeds up rescoring. Experiments show that the RNNLM yields about a 4.5% relative reduction in word error rate (WER). Compared to the conventional n-best list rescoring method, PTNR achieves a speed-up factor of 3-4. Compared to the cache-based method, the design of PTNR is simpler and more explicit, and PTNR requires a smaller memory footprint.
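To illustrate the idea behind PTNR, here is a minimal sketch: hypotheses in an N-best list often share long prefixes, so arranging them in a prefix tree lets each unique prefix be scored by the language model exactly once instead of once per hypothesis. The `rnnlm_step` function below is a hypothetical, deterministic stand-in for a real RNNLM forward step (which would carry a hidden-state vector); the tree-walking logic is an assumption-laden reconstruction of the general technique, not the paper's exact implementation.

```python
def rnnlm_step(state, word):
    """Hypothetical stand-in for one RNNLM step: returns (log-prob, new state).
    A real RNNLM would propagate a hidden vector; here the 'state' is just
    the word history, and the score is an arbitrary deterministic function."""
    new_state = state + (word,)
    logp = -(1.0 + 0.1 * len(word) + 0.01 * len(new_state))
    return logp, new_state

def rescore_naive(nbest):
    """Conventional rescoring: every hypothesis is scored independently,
    so shared prefixes are recomputed again and again."""
    scores, calls = [], 0
    for hyp in nbest:
        state, total = (), 0.0
        for w in hyp:
            logp, state = rnnlm_step(state, w)
            total += logp
            calls += 1
        scores.append(total)
    return scores, calls

def rescore_ptnr(nbest):
    """Prefix-tree rescoring: each unique prefix is scored exactly once."""
    # Build a prefix tree over all hypotheses; the reserved '$ids' key
    # (assumed not to be a vocabulary word) marks hypothesis endpoints.
    root = {}
    for i, hyp in enumerate(nbest):
        node = root
        for w in hyp:
            node = node.setdefault(w, {})
        node.setdefault('$ids', []).append(i)

    scores = [0.0] * len(nbest)
    calls = 0

    def dfs(node, state, total):
        nonlocal calls
        for w, child in node.items():
            if w == '$ids':
                for i in child:       # a full hypothesis ends here
                    scores[i] = total
                continue
            logp, new_state = rnnlm_step(state, w)  # one call per unique prefix
            calls += 1
            dfs(child, new_state, total + logp)

    dfs(root, (), 0.0)
    return scores, calls
```

With three hypotheses sharing the prefix "i like"/"i", the naive method makes nine RNNLM calls while the prefix-tree walk makes only six, yet both return identical scores; the saving grows with the depth of shared prefixes, which is where the paper's reported 3-4x speed-up comes from.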


