International Joint Conference on Neural Networks

Dynamic Global-Local Attention Network Based On Capsules for Text Classification

Abstract

Text classification requires comprehensive consideration of both the global and the local information in a text. However, most methods treat global and local features as two separate parts and ignore the relationship between them. In this paper, we propose a Dynamic Global-Local Attention Network based on Capsules (DGLA) that uses global features to dynamically adjust the importance of local features (e.g., sentence-level or phrase-level features). The global features of the text are extracted by a capsule network, which captures the mutual positional relationships of the input features to uncover additional hidden information. Furthermore, we design two global-local attention mechanisms within DGLA to measure the importance of the two kinds of local features, and we combine the advantages of these two attention mechanisms through a residual network. The model was evaluated on seven benchmark text classification datasets, and DGLA achieved the highest accuracy on all of them. Ablation experiments show that the global-local attention mechanism significantly improves the performance of the model.
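The core idea described above, using a global feature vector to dynamically weight local feature vectors and then fusing the result through a residual connection, can be illustrated with a minimal numpy sketch. This is not the paper's exact formulation; the function names, the scaled dot-product scoring, and the single-branch residual fusion are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(g, locals_):
    """Sketch of a global-local attention step (hypothetical form).

    g       : (d,)  global feature, e.g. from a capsule encoder
    locals_ : (n, d) local features, e.g. sentence- or phrase-level
    Returns the attention-weighted local summary and the weights.
    """
    scores = locals_ @ g / np.sqrt(g.shape[0])  # similarity of each local feature to the global one
    w = softmax(scores)                         # importance of each local feature
    return w @ locals_, w                       # (d,) attended summary, (n,) weights

# Toy example: one global vector attending over three phrase-level vectors.
rng = np.random.default_rng(0)
g = rng.normal(size=8)
locals_ = rng.normal(size=(3, 8))
ctx, w = global_local_attention(g, locals_)
fused = g + ctx  # residual-style fusion of global and attended local features (sketch)
```

In the full model, two such attention branches (one per local-feature granularity) would be combined through the residual network; here a single branch is shown to keep the sketch self-contained.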
