
Neural Collective Entity Linking

Abstract

Entity Linking aims to link entity mentions in texts to knowledge bases, and neural models have achieved recent success in this task. However, most existing methods rely on local contexts to resolve entities independently, which often fails due to the sparsity of local information. To address this issue, we propose a novel neural model for collective entity linking, named NCEL. NCEL applies a Graph Convolutional Network to integrate both local contextual features and global coherence information for entity linking. To improve computational efficiency, we approximately perform graph convolution on a subgraph of adjacent entity mentions rather than on all mentions in the entire text. We further introduce an attention scheme to improve the robustness of NCEL to data noise, and we train the model on Wikipedia hyperlinks to avoid overfitting and domain bias. In experiments, we evaluate NCEL on five publicly available datasets to verify its linking performance and generalization ability. We also conduct an extensive analysis of time complexity, the impact of key modules, and qualitative results, which demonstrates the effectiveness and efficiency of the proposed method.
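The abstract's core operation is a graph convolution over a subgraph of candidate entities for adjacent mentions. NCEL's actual architecture is more involved; as a rough illustration of that step alone, here is a minimal sketch of one spectral-style GCN layer applied to a toy candidate subgraph (the graph, feature sizes, and function names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: symmetrically normalized adjacency
    times node features times a learned weight matrix, with ReLU."""
    A_hat = A + np.eye(A.shape[0])                     # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))      # D^{-1/2}
    A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)             # ReLU activation

# Toy subgraph: 4 candidate entities drawn from two adjacent mentions;
# edges connect candidates of neighboring mentions (coherence links).
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
H = rng.standard_normal((4, 8))   # candidate features (e.g. local context + entity embedding)
W = rng.standard_normal((8, 8))   # layer weights (would be learned in training)
H1 = gcn_layer(A, H, W)
print(H1.shape)                    # each candidate now mixes in neighbor information
```

Restricting `A` to adjacent mentions' candidates (rather than all mentions in the document) is what keeps the per-layer cost low, at the price of propagating global coherence only indirectly through stacked layers.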
