
Efficient linear combination for distant n-gram models

Abstract

The objective of this paper is to present a large study concerning the use of distant language models. In order to combine distant and classical models efficiently, an adaptation of the back-off principle is made. We also show the importance of each part of a history for the prediction. In fact, each sub-history is analyzed in order to estimate its importance in terms of prediction, and a weight is then associated with each class of sub-histories. The combined models therefore take into account the features of each part of the history rather than the whole history, as is done in other works. The contribution of distant n-gram models in terms of perplexity is significant and improves the results by 12.8%. Making the linear combination depend on sub-histories achieves an improvement of 5.3% in comparison to a classical linear combination.
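The sketch below illustrates the general idea of a sub-history-dependent linear combination as described in the abstract: the interpolation weight between a classical n-gram model and a distant n-gram model is chosen per class of sub-history rather than globally. The component models, the sub-history classifier, and the per-class weights are hypothetical placeholders, not the authors' implementation.

# Minimal sketch (assumed names throughout): interpolate a classical n-gram
# model with a distant n-gram model, with a mixture weight that depends on
# the class of the sub-history instead of a single global weight.

def interpolated_prob(word, history, p_classical, p_distant, lam, history_class):
    """P(word | history) as a sub-history-dependent linear combination.

    p_classical(word, history): probability from the classical n-gram model
        (contiguous sub-history, i.e. the last n-1 words).
    p_distant(word, history):   probability from the distant n-gram model
        (a word further back in the history).
    lam[c]: interpolation weight for sub-history class c, with 0 <= lam[c] <= 1.
    history_class(history): maps the history to one of the sub-history classes.
    """
    c = history_class(history)
    w = lam[c]
    return w * p_classical(word, history) + (1.0 - w) * p_distant(word, history)


# Toy usage with stand-in probabilities and two assumed sub-history classes.
if __name__ == "__main__":
    lam = {"long": 0.8, "short": 0.6}                     # assumed per-class weights
    history_class = lambda h: "long" if len(h) > 2 else "short"
    p_classical = lambda w, h: 0.02                       # placeholder model outputs
    p_distant = lambda w, h: 0.05
    print(interpolated_prob("models", ["distant", "n-gram"],
                            p_classical, p_distant, lam, history_class))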