
DICTIONARY-LANGUAGE MODEL COMPRESSION METHOD, DEVICE THEREFOR, AND PROGRAM


Abstract

PROBLEM TO BE SOLVED: To provide a dictionary-language model compression method which can effectively reduce the size of a dictionary by deleting words having little influence.

SOLUTION: The dictionary-language model compression method includes an entropy variation calculation step S10, a language model compression processing step S20, and a dictionary compression processing step S30. The entropy variation calculation step S10 calculates the entropy variation that results when a language entry of the language model is deleted by expanding it into a 1-gram probability; when the entropy variation of a word composing the dictionary, in terms of its 1-gram probability, is equal to or less than a threshold, the corresponding language entries are defined as deletion target language entries and the corresponding words as deletion target words. The language model compression processing step S20 deletes the deletion target language entries from the language model. The dictionary compression processing step S30 deletes the deletion target words from the dictionary.

COPYRIGHT: (C)2014, JPO&INPIT
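The three steps above can be sketched as follows. This is a minimal illustration only, not the patent's actual method: the patent does not disclose its exact entropy-variation formula here, so this sketch assumes a simple approximation in which removing a word with unigram probability `p` changes the model entropy by its `-p·log(p)` contribution; the function names and data structures are hypothetical.

```python
import math

def unigram_entropy_variation(p):
    # Assumed approximation: the entropy contribution -p*log(p) of a word
    # with unigram probability p.  The patent's exact formula is not given
    # in the abstract, so this stands in for it.
    return -p * math.log(p)

def compress(language_model, dictionary, unigrams, threshold):
    """Sketch of steps S10-S30 from the abstract.

    language_model: dict mapping n-gram tuples of words -> probability
    dictionary:     list of words
    unigrams:       dict mapping word -> 1-gram probability
    threshold:      entropy-variation cutoff for deletion
    """
    # S10: words whose entropy variation is at or below the threshold
    # become deletion targets.
    deletion_words = {w for w, p in unigrams.items()
                      if unigram_entropy_variation(p) <= threshold}
    # S20: delete language entries containing any deletion-target word.
    compressed_lm = {entry: p for entry, p in language_model.items()
                     if not any(w in deletion_words for w in entry)}
    # S30: delete the deletion-target words from the dictionary.
    compressed_dict = [w for w in dictionary if w not in deletion_words]
    return compressed_lm, compressed_dict
```

For example, with a frequent word and a very rare word, a small threshold removes only the rare word and every n-gram entry that mentions it, leaving the rest of the model intact.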

Bibliographic Data

  • Publication No.: JP2014098760A

  • Patent Type:

  • Publication Date: 2014-05-29

  • Original Format: PDF

  • Applicant: NIPPON TELEGR & TELEPH CORP NTT

  • Application No.: JP20120249506

  • Inventors: MASATAKI HIROKAZU; MASUMURA AKIRA

  • Filing Date: 2012-11-13

  • IPC Classes: G10L15/06; G10L15/187; G10L15/197

  • Country: JP

