NAFOSTED Conference on Information and Computer Science

Fine-Tuning BERT for Sentiment Analysis of Vietnamese Reviews



Abstract

Sentiment analysis is an important task in the field of Natural Language Processing (NLP), in which users' feedback on a specific issue is evaluated and analyzed. Many deep learning models have been proposed to tackle this task, including the recently introduced Bidirectional Encoder Representations from Transformers (BERT) model. In this paper, we experiment with two BERT fine-tuning methods for the sentiment analysis task on datasets of Vietnamese reviews: 1) a method that uses only the [CLS] token as the input for an attached feed-forward neural network, and 2) another method in which all BERT output vectors are used as the input for classification. Experimental results on two datasets show that models using BERT slightly outperform models using GloVe and FastText. Moreover, on the datasets employed in this study, our proposed BERT fine-tuning method produces a model with better performance than the original BERT fine-tuning method.
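The two fine-tuning variants described above differ only in which part of BERT's output feeds the classification head. A minimal NumPy sketch of that difference is given below; the hidden size (768), the random classifier weights, the mean-pooling aggregation in the second head, and the stand-in tensor for BERT's output are all illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN, CLASSES = 768, 2  # BERT-base hidden size; binary sentiment (assumed)

# Hypothetical classifier weights, randomly initialised for illustration.
W = rng.normal(size=(HIDDEN, CLASSES))
b = np.zeros(CLASSES)

def cls_head(bert_output):
    """Method 1: classify from the [CLS] vector (first token) only."""
    cls_vec = bert_output[:, 0, :]        # (batch, hidden)
    return cls_vec @ W + b                # (batch, classes) logits

def all_token_head(bert_output):
    """Method 2: use all BERT output vectors; here they are mean-pooled
    over the sequence before the linear layer (one simple aggregation
    choice, assumed for this sketch)."""
    pooled = bert_output.mean(axis=1)     # (batch, hidden)
    return pooled @ W + b                 # (batch, classes) logits

# Stand-in for BERT output: batch of 4 reviews, 16 tokens each.
fake = rng.normal(size=(4, 16, HIDDEN))
print(cls_head(fake).shape)               # (4, 2)
print(all_token_head(fake).shape)         # (4, 2)
```

In a real fine-tuning setup both heads would sit on top of a pretrained BERT encoder and be trained end-to-end; the sketch only shows how the two methods select their input from the encoder's `(batch, seq_len, hidden)` output.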
