Conference on Empirical Methods in Natural Language Processing

Attention-based LSTM Network for Cross-Lingual Sentiment Classification



Abstract

Most state-of-the-art sentiment classification methods are based on supervised learning algorithms, which require large amounts of manually labeled data. However, labeled resources are usually imbalanced across languages. Cross-lingual sentiment classification tackles this problem by adapting the sentiment resources of a resource-rich language to resource-poor languages. In this study, we propose an attention-based bilingual representation learning model which learns the distributed semantics of documents in both the source and the target languages. In each language, we use a Long Short-Term Memory (LSTM) network to model the documents, an approach that has been shown to be very effective for word sequences. Meanwhile, we propose a hierarchical attention mechanism for the bilingual LSTM network. The sentence-level attention model learns which sentences of a document are more important for determining the overall sentiment, while the word-level attention model learns which words in each sentence are decisive. The proposed model achieves good results on a benchmark dataset using English as the source language and Chinese as the target language.
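The hierarchical attention idea described above can be illustrated with a minimal NumPy sketch: word-level attention pools each sentence's word states (e.g. LSTM outputs) into a sentence vector, and sentence-level attention pools those into a document vector. The dot-product scoring and the `word_query`/`sent_query` parameter names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(states, query):
    # Score each state against a learned query vector (assumed dot-product
    # scoring), normalize with softmax, and return the weighted sum.
    weights = softmax(states @ query)
    return weights @ states

rng = np.random.default_rng(0)
dim = 8
# A toy "document": 3 sentences of 5, 7, and 4 word states each,
# standing in for per-word LSTM hidden states.
sentences = [rng.normal(size=(n, dim)) for n in (5, 7, 4)]
word_query = rng.normal(size=dim)  # word-level attention parameter (hypothetical)
sent_query = rng.normal(size=dim)  # sentence-level attention parameter (hypothetical)

# Word-level attention: pool word states into one vector per sentence.
sent_vecs = np.stack([attention_pool(s, word_query) for s in sentences])
# Sentence-level attention: pool sentence vectors into a document vector,
# which would then feed a sentiment classifier.
doc_vec = attention_pool(sent_vecs, sent_query)
print(doc_vec.shape)  # (8,)
```

In the paper's bilingual setting, such a document vector would be produced for both the source- and target-language sides; this sketch shows only the monolingual pooling step.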
