Venue: Annual Meeting of the Association for Computational Linguistics

CSE: Conceptual Sentence Embeddings based on Attention Model



Abstract

Most sentence embedding models represent each sentence using only its surface words, which leaves them unable to discriminate among the ubiquitous cases of homonymy and polysemy. To enhance the representation capability of sentences, we employ a conceptualization model to assign associated concepts to each sentence in the text corpus, and then learn conceptual sentence embeddings (CSE). This semantic representation is more expressive than widely used text representation models such as latent topic models, especially for short text. Moreover, we further extend the CSE models with a local attention-based model that selects relevant words within the context to make predictions more efficiently. In the experiments, we evaluate the CSE models on two tasks, text classification and information retrieval. The experimental results show that the proposed models outperform typical sentence embedding models.
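The local attention idea in the abstract can be illustrated with a minimal sketch: each context word is scored against a concept vector (as produced by a conceptualization step), the scores are normalized with a softmax, and the sentence embedding is the attention-weighted sum of the word vectors. All function and variable names here are hypothetical, and this is a simplified illustration of attention pooling, not the paper's exact model.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attentive_sentence_embedding(word_vecs, concept_vec):
    """Weight each context word by its relevance to the sentence's
    concept vector, then take the weighted sum (simplified local attention)."""
    scores = word_vecs @ concept_vec   # relevance score per word
    weights = softmax(scores)          # normalized attention weights
    return weights @ word_vecs         # attention-weighted sentence vector

# Toy example with random vectors standing in for learned embeddings.
rng = np.random.default_rng(0)
words = rng.normal(size=(5, 8))    # 5 context words, 8-dim embeddings
concept = rng.normal(size=8)       # concept vector from conceptualization
emb = attentive_sentence_embedding(words, concept)
print(emb.shape)  # (8,)
```

Words strongly aligned with the sentence's concept receive larger weights, so the pooled vector is biased toward the contextually relevant sense, which is what lets the model distinguish homonymous uses of the same surface word.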

