OleNet at SemEval-2019 Task 9: BERT based Multi-Perspective Models for Suggestion Mining

Abstract

This paper describes our system for Task 9 of SemEval-2019, which focuses on suggestion mining: classifying given sentences into suggestion and non-suggestion classes in domain-specific and cross-domain training settings, respectively. We propose a multi-perspective architecture that learns representations using several classical models, including Convolutional Neural Networks (CNN), Gated Recurrent Units (GRU), and Feed-Forward Attention (FFA). To leverage the semantics distributed in large amounts of unsupervised data, we also adopt the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model as an encoder to produce sentence and word representations. The proposed architecture is applied to both subtasks, achieving an F1-score of 0.7812 for Subtask A and 0.8579 for Subtask B. We won first and second place for the two subtasks, respectively, in the final competition.
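The abstract names the ingredients (BERT encoder, CNN, GRU, and feed-forward attention perspectives, with outputs combined for binary classification) but not the exact dimensions or combination scheme. The sketch below is a minimal, hypothetical PyTorch rendering of that multi-perspective idea: all layer sizes and the concatenate-then-classify fusion are assumptions for illustration, and a random tensor stands in for real BERT token representations.

```python
import torch
import torch.nn as nn

class MultiPerspectiveClassifier(nn.Module):
    """Hypothetical sketch of a multi-perspective suggestion classifier.

    Input is a batch of token representations as a BERT encoder would
    produce them; layer sizes are illustrative, not the paper's.
    """
    def __init__(self, hidden=768, n_classes=2):
        super().__init__()
        # CNN perspective: 1-D convolution over the token axis
        self.conv = nn.Conv1d(hidden, 128, kernel_size=3, padding=1)
        # GRU perspective: bidirectional recurrence over tokens
        self.gru = nn.GRU(hidden, 64, batch_first=True, bidirectional=True)
        # Feed-forward attention perspective: one scalar score per token
        self.att = nn.Linear(hidden, 1)
        # Concatenated perspectives -> suggestion / non-suggestion logits
        self.out = nn.Linear(128 + 128 + hidden, n_classes)

    def forward(self, x):  # x: (batch, seq_len, hidden)
        # Max-pool the CNN feature maps over the sequence dimension
        cnn = self.conv(x.transpose(1, 2)).max(dim=2).values      # (batch, 128)
        # Take the GRU output at the final timestep
        gru = self.gru(x)[0][:, -1, :]                            # (batch, 128)
        # Attention-weighted sum of token representations
        w = torch.softmax(self.att(x).squeeze(-1), dim=1)         # (batch, seq_len)
        ffa = (w.unsqueeze(-1) * x).sum(dim=1)                    # (batch, hidden)
        return self.out(torch.cat([cnn, gru, ffa], dim=1))

# Stand-in for BERT output: 2 sentences, 10 tokens, 768-dim vectors
logits = MultiPerspectiveClassifier()(torch.randn(2, 10, 768))
print(logits.shape)
```

Each perspective reduces the variable-length token sequence to a fixed-size vector in a different way (pooling, recurrence, attention), which is one plausible reading of how the complementary views would be fused before classification.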
