Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling

Abstract

The present disclosure generally relates to systems and processes for morpheme-based word prediction. An example method includes receiving a current word; determining a context of the current word based on the current word and a context of a previous word; determining, using a morpheme-based language model, a likelihood of a prefix based on the context of the current word; determining, using the morpheme-based language model, a likelihood of a stem based on the context of the current word; determining, using the morpheme-based language model, a likelihood of a suffix based on the context of the current word; determining a next word based on the likelihood of the prefix, the likelihood of the stem, and the likelihood of the suffix; and providing an output including the next word.
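The abstract describes a recurrent architecture in which a shared context vector is updated from the current word and the previous context, and separate output heads score candidate prefixes, stems, and suffixes for the next word. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the patent's actual implementation: the class name, layer choices (embedding plus GRU cell), vocabulary sizes, and the simplification of forming the next word from the independently most likely prefix, stem, and suffix are all assumptions made for illustration.

```python
import torch
import torch.nn as nn


class MorphemeRNN(nn.Module):
    """Multi-task recurrent sketch: one shared context, three morpheme heads."""

    def __init__(self, vocab_size, prefix_count, stem_count, suffix_count,
                 embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)    # current-word embedding
        self.rnn_cell = nn.GRUCell(embed_dim, hidden_dim)   # updates the running context
        # One output head per morpheme type, all sharing the same context.
        self.prefix_head = nn.Linear(hidden_dim, prefix_count)
        self.stem_head = nn.Linear(hidden_dim, stem_count)
        self.suffix_head = nn.Linear(hidden_dim, suffix_count)

    def forward(self, current_word_id, prev_context):
        # Context of the current word = f(current word, context of previous word).
        context = self.rnn_cell(self.embed(current_word_id), prev_context)
        # Likelihoods of prefix, stem, and suffix given the current context.
        prefix_logp = torch.log_softmax(self.prefix_head(context), dim=-1)
        stem_logp = torch.log_softmax(self.stem_head(context), dim=-1)
        suffix_logp = torch.log_softmax(self.suffix_head(context), dim=-1)
        return prefix_logp, stem_logp, suffix_logp, context


# Example usage with made-up sizes: pick the most likely morpheme in each head.
model = MorphemeRNN(vocab_size=10000, prefix_count=50, stem_count=5000, suffix_count=80)
prev_context = torch.zeros(1, 256)
word_id = torch.tensor([42])
prefix_logp, stem_logp, suffix_logp, context = model(word_id, prev_context)
best = (prefix_logp.argmax().item(), stem_logp.argmax().item(), suffix_logp.argmax().item())
print("most likely (prefix, stem, suffix) ids:", best)
```

In this sketch the three heads are trained jointly against the shared recurrent context, which is the "multi-task" aspect named in the title; how the claimed method actually combines the three likelihoods into a next-word prediction is specified in the patent claims, not here.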

Bibliographic Data

  • Publication number: US10657328B2
  • Patent type:
  • Publication date: 2020.05.19
  • Original document format: PDF
  • Applicant/Assignee:
  • Application number: US15851487
  • Inventors: Jerome R. Bellegarda; Jannes G. Dolfing
  • Filing date: 2017.12.21
  • Classification:
  • Country: US
  • Date added to database: 2022-08-21 10:58:36
