INTERSPEECH 2010: Annual Conference of the International Speech Communication Association

Efficient Estimation of Maximum Entropy Language Models with N-gram features: an SRILM extension



Abstract

We present an extension to the SRILM toolkit for training maximum entropy language models with N-gram features. The extension uses a hierarchical parameter estimation procedure [1] that keeps training time and memory consumption feasible for moderately large training data (hundreds of millions of words). Experiments on two speech recognition tasks indicate that models trained with our implementation perform on par with or better than N-gram models built with interpolated Kneser-Ney discounting.
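The abstract concerns maximum entropy (log-linear) language models with N-gram indicator features. As a rough illustration of that model class only, here is a minimal sketch of a bigram maxent model trained by plain gradient ascent; the toy corpus, feature set, and training loop are invented for this example and are not the paper's SRILM extension or its hierarchical estimation procedure.

```python
# Minimal maximum-entropy language model with unigram and bigram
# indicator features, trained by gradient ascent on a toy corpus.
# Illustrative only; NOT the SRILM extension described in the paper.
import math
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate".split()
vocab = sorted(set(corpus))

def features(prev, word):
    # N-gram indicator features: one per unigram, one per bigram.
    return [("uni", word), ("bi", prev, word)]

weights = defaultdict(float)

def prob(prev, word):
    # P(w | prev) = exp(sum of active feature weights) / Z(prev)
    scores = {w: math.exp(sum(weights[f] for f in features(prev, w)))
              for w in vocab}
    z = sum(scores.values())
    return scores[word] / z

def train(epochs=200, lr=0.5):
    pairs = list(zip(["<s>"] + corpus[:-1], corpus))
    for _ in range(epochs):
        grad = defaultdict(float)
        for prev, word in pairs:
            # Gradient of log-likelihood: observed feature counts
            # minus expected counts under the current model.
            for f in features(prev, word):
                grad[f] += 1.0
            for w in vocab:
                p = prob(prev, w)
                for f in features(prev, w):
                    grad[f] -= p
        for f, g in grad.items():
            weights[f] += lr * g / len(pairs)

train()
```

After training, frequent continuations receive higher conditional probability, e.g. `prob("the", "cat")` exceeds `prob("the", "mat")` on this corpus. A real implementation at the scale the paper targets replaces this dense loop with sparse feature expectations and a regularized optimizer.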

