
Deep Learning for Semantic Composition



Abstract

Learning representations to model the meaning of text has been a core problem in natural language processing (NLP). The last several years have seen extensive interest in distributional approaches, in which text spans of different granularities are encoded as continuous vectors. If properly learned, such representations have been shown to help achieve state-of-the-art performance on a variety of NLP problems.
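As a minimal sketch (not from the paper) of the idea behind distributional composition: each word is mapped to a continuous vector, and a text span is encoded by combining its word vectors. Averaging is the simplest composition function; the embedding table and span below are toy examples, and real systems learn both the vectors and the composition function from data.

```python
import numpy as np

# Toy embedding table; real systems learn these vectors from large corpora.
EMBEDDINGS = {
    "deep":     np.array([0.9, 0.1, 0.0]),
    "learning": np.array([0.8, 0.2, 0.1]),
    "semantic": np.array([0.1, 0.9, 0.3]),
}

def compose_span(words, embeddings):
    """Encode a text span as the element-wise mean of its word vectors."""
    vectors = [embeddings[w] for w in words if w in embeddings]
    return np.mean(vectors, axis=0)

span_vec = compose_span(["deep", "learning"], EMBEDDINGS)
print(span_vec)  # prints [0.85 0.15 0.05]
```

More expressive composition functions (recurrent, recursive, or attention-based encoders) replace the averaging step while keeping the same overall contract: a sequence of word vectors in, one span vector out.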
