Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing

Two Efficient Lattice Rescoring Methods Using Recurrent Neural Network Language Models


Abstract

An important part of the language modelling problem for automatic speech recognition (ASR) systems, and many other related applications, is to appropriately model long-distance context dependencies in natural languages. Hence, statistical language models (LMs) that can model longer-span history contexts, for example, recurrent neural network language models (RNNLMs), have become increasingly popular for state-of-the-art ASR systems. As RNNLMs use a vector representation of complete history contexts, they are normally used to rescore N-best lists. Motivated by their intrinsic characteristics, two efficient lattice rescoring methods for RNNLMs are proposed in this paper. The first method uses an n-gram style clustering of history contexts. The second approach directly exploits the distance measure between recurrent hidden history vectors. Both methods produced 1-best performance comparable to a 10k-best rescoring baseline RNNLM system on two large vocabulary conversational telephone speech recognition tasks for US English and Mandarin Chinese. Consistent lattice size compression and recognition performance improvements after confusion network (CN) decoding were also obtained over the prefix tree structured N-best rescoring approach.
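The two history-merging ideas in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the greedy centroid-free merging strategy, and the toy threshold are all illustrative assumptions. The first function groups full word histories into n-gram style equivalence classes (histories sharing the last n-1 words are treated as interchangeable, so only one representative per class needs an RNNLM forward pass); the second greedily merges recurrent hidden history vectors whose Euclidean distance to an existing cluster representative falls below a threshold.

```python
import math
from collections import defaultdict

def cluster_histories(histories, n=3):
    """n-gram style clustering: group full word histories by their
    truncated (n-1)-word suffix. Histories in the same class are
    assumed to yield approximately the same RNNLM state."""
    clusters = defaultdict(list)
    for h in histories:
        key = tuple(h[-(n - 1):])  # equivalence class = last n-1 words
        clusters[key].append(h)
    return dict(clusters)

def merge_by_distance(vectors, threshold=0.1):
    """Hidden-vector clustering: greedily assign each recurrent history
    vector to the first existing cluster whose representative is within
    `threshold` Euclidean distance; otherwise start a new cluster.
    Returns the cluster index assigned to each input vector."""
    reps = []        # one representative vector per cluster
    assignment = []
    for v in vectors:
        for i, r in enumerate(reps):
            if math.dist(v, r) < threshold:
                assignment.append(i)
                break
        else:
            reps.append(v)
            assignment.append(len(reps) - 1)
    return assignment

# Two histories share the bigram suffix ("sat", "on") and are merged.
histories = [
    ("the", "cat", "sat", "on"),
    ("a", "cat", "sat", "on"),
    ("the", "dog", "ran", "to"),
]
clusters = cluster_histories(histories, n=3)

# Two nearby hidden vectors collapse into one cluster; the distant one
# starts its own.
assignment = merge_by_distance([(0.0, 0.0), (0.05, 0.0), (1.0, 1.0)])
```

In a real lattice rescorer, either merge criterion bounds the number of distinct RNNLM states expanded per lattice node, which is what yields the reported speed-up over exhaustive N-best rescoring.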
