
Neural Fine-Grained Entity Type Classification with Hierarchy-Aware Loss

Abstract

The task of Fine-grained Entity Type Classification (FETC) consists of assigning types from a hierarchy to entity mentions in text. Existing methods rely on distant supervision and are thus susceptible to noisy labels that can be out-of-context or overly-specific for the training sentence. Previous methods that attempt to address these issues do so with heuristics or with the help of hand-crafted features. Instead, we propose an end-to-end solution with a neural network model that uses a variant of the cross-entropy loss function to handle out-of-context labels, and hierarchical loss normalization to cope with overly-specific ones. Also, previous work solves FETC as a multi-label classification problem followed by ad-hoc post-processing. In contrast, our solution is more elegant: we use public word embeddings to train a single-label model that jointly learns representations for entity mentions and their context. We show experimentally that our approach is robust against noise and consistently outperforms the state-of-the-art on established benchmarks for the task.
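The idea of a hierarchy-aware loss can be illustrated with a minimal Python sketch, which is not the authors' exact formulation: a fraction of the probability the model assigns to descendants of the gold type is credited back to the gold type before computing the negative log-likelihood, so overly-specific predictions along the correct path are penalized less than predictions of unrelated types. The toy type hierarchy, the `beta` weight, and all function names below are illustrative assumptions.

```python
import numpy as np

# Toy type hierarchy (assumed for illustration): child -> parent, None for roots.
TYPE_HIERARCHY = {
    "/person": None,
    "/person/artist": "/person",
    "/person/artist/singer": "/person/artist",
    "/organization": None,
}
TYPES = list(TYPE_HIERARCHY)
INDEX = {t: i for i, t in enumerate(TYPES)}

def descendants(t):
    """All types strictly below t in the hierarchy (path-prefix convention)."""
    return [c for c in TYPES if c != t and c.startswith(t + "/")]

def hierarchy_aware_nll(logits, gold_type, beta=0.3):
    """Negative log-likelihood with a hierarchy-aware adjustment (sketch only).

    A fraction `beta` of the probability mass assigned to descendants of the
    gold type is added to the gold type's own probability before taking the
    log, softening the penalty for overly-specific labels.
    """
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    gold_p = probs[INDEX[gold_type]]
    gold_p += beta * sum(probs[INDEX[d]] for d in descendants(gold_type))
    return -np.log(gold_p)

if __name__ == "__main__":
    # Model favors the overly-specific /person/artist/singer.
    logits = np.array([0.2, 1.5, 2.0, -1.0])
    print(hierarchy_aware_nll(logits, "/person/artist"))  # softened loss
    print(hierarchy_aware_nll(logits, "/organization"))   # full penalty, unrelated type
```

Running the sketch shows a smaller loss when the gold type is an ancestor of the model's preferred label than when it lies on an unrelated branch, which is the behavior the hierarchical loss normalization is meant to encourage.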
