5th Workshop on Automated Knowledge Base Construction

An Attentive Neural Architecture for Fine-grained Entity Type Classification



Abstract

In this work we propose a novel attention-based neural network model for the task of fine-grained entity type classification that, unlike previously proposed models, recursively composes representations of entity mention contexts. Our model achieves state-of-the-art performance with a 74.94% loose micro F1-score on the well-established FIGER dataset, a relative improvement of 2.59%. We also investigate the behavior of the attention mechanism of our model and observe that it can learn contextual linguistic expressions that indicate the fine-grained category memberships of an entity.
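The abstract only outlines the idea at a high level: context tokens around an entity mention are encoded, an attention mechanism weights them, and the attention-weighted context representation is combined with a mention representation to predict fine-grained types. The sketch below is a minimal, illustrative NumPy rendering of that idea under assumed dimensions; it is not the authors' actual architecture (which composes context representations recursively, e.g. with recurrent encoders), and the parameter names (`W_e`, `w_a`, `W_y`), sizes, and random initialization are hypothetical stand-ins for learned components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; not taken from the paper).
d_word, d_attn, n_types = 50, 20, 4

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_context(context_vecs, W_e, w_a):
    """Score each context token, normalize with softmax, and return the
    attention weights plus the attention-weighted sum of the context vectors."""
    scores = np.array([w_a @ np.tanh(W_e @ h) for h in context_vecs])
    alpha = softmax(scores)
    return alpha, (alpha[:, None] * context_vecs).sum(axis=0)

# Randomly initialized parameters and toy inputs stand in for learned ones.
W_e = rng.normal(scale=0.1, size=(d_attn, d_word))
w_a = rng.normal(scale=0.1, size=d_attn)
W_y = rng.normal(scale=0.1, size=(n_types, 2 * d_word))

mention_vec = rng.normal(size=d_word)         # e.g. average of mention word vectors
context_vecs = rng.normal(size=(7, d_word))   # encoded left/right context tokens

alpha, context_rep = attentive_context(context_vecs, W_e, w_a)
features = np.concatenate([mention_vec, context_rep])
type_probs = 1.0 / (1.0 + np.exp(-(W_y @ features)))  # independent per-type probabilities

print("attention weights:", np.round(alpha, 3))
print("predicted type probabilities:", np.round(type_probs, 3))
```

The sigmoid output per type reflects that fine-grained typing is usually treated as multi-label classification, and the normalized weights `alpha` are what one would inspect to see which contextual expressions the model attends to, as the abstract's analysis of the attention mechanism suggests.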

