
Simultaneous Fake News and Topic Classification via Auxiliary Task Learning



Abstract

Using social media, in particular reading news articles, has become a daily activity and an important way of spreading information. Classifying the topics of news articles can provide up-to-date information about the current state of politics and society. However, this convenient way of sharing information can also lead to the growth of falsified content. Therefore, distinguishing between real and fake news, as well as fake-news classification, has become essential and indispensable. In this paper, we propose a new and up-to-date dataset for both fake-news classification and topic classification. To the best of our knowledge, we are the first to construct a dataset with both fake-news and topic labels, and to employ multi-task learning to learn these two tasks simultaneously. We have collected 21K online news articles published from January 2013 to March 2020. We propose an auxiliary-task long short-term memory (AT-LSTM) neural network for text classification via multi-task learning. We evaluate and compare our proposed model against five baseline methods, via both single-task and multi-task learning, on this new benchmark dataset. Experimental results show that our proposed AT-LSTM model outperforms the single-task learning methods and the hard parameter-sharing multi-task learning methods. The dataset and code will be released in the future.
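As context for the multi-task setup described above, the following is a minimal sketch, not the authors' released code, of a shared LSTM encoder with two task-specific heads: fake-news detection as the main task and topic classification as an auxiliary task. All names, layer sizes, and the auxiliary loss weight are illustrative assumptions; the abstract does not specify the AT-LSTM internals.

```python
import torch
import torch.nn as nn

class MultiTaskLSTM(nn.Module):
    """Sketch of auxiliary-task learning for text classification (assumed design)."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128,
                 num_topics=10, aux_weight=0.3):
        super().__init__()
        self.aux_weight = aux_weight                      # weight of the auxiliary (topic) loss
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)  # shared encoder
        self.fake_head = nn.Linear(2 * hidden_dim, 2)                 # main task: real vs. fake
        self.topic_head = nn.Linear(2 * hidden_dim, num_topics)       # auxiliary task: topic

    def forward(self, token_ids):
        emb = self.embedding(token_ids)
        _, (h_n, _) = self.encoder(emb)
        # Concatenate the final forward and backward hidden states as the document vector.
        doc = torch.cat([h_n[-2], h_n[-1]], dim=-1)
        return self.fake_head(doc), self.topic_head(doc)

    def loss(self, fake_logits, topic_logits, fake_labels, topic_labels):
        # Main-task loss plus a down-weighted auxiliary loss.
        ce = nn.functional.cross_entropy
        return ce(fake_logits, fake_labels) + self.aux_weight * ce(topic_logits, topic_labels)
```

In a setup like this, both heads are trained jointly on each batch, and the auxiliary topic loss acts as a regularizer that shapes the shared representation used for fake-news detection; the hard parameter-sharing baselines mentioned above would instead share the encoder without an explicit main/auxiliary weighting.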
