SIGBioMed Workshop on Biomedical Language Processing

SB_NITK at MEDIQA 2021: Leveraging Transfer Learning for Question Summarization in Medical Domain



Abstract

Recent strides in the healthcare domain have resulted in vast quantities of streaming data available for building intelligent knowledge-based applications. However, the challenges introduced by the huge volume, velocity of generation, variety and variability of this medical data must be adequately addressed. In this paper, we describe the model and results for our submission to the MEDIQA 2021 Question Summarization shared task. To improve the performance of consumer health question summarization, our method explores transfer learning to leverage the knowledge of NLP transformers such as BART, T5 and PEGASUS. The proposed models utilize the knowledge of these pre-trained transformers to achieve improved results compared to conventional deep learning models such as LSTMs and RNNs. Our team SB_NITK ranked 12th among the 22 submissions in the official final rankings, and our BART-based model achieved a ROUGE-2 F1 score of 0.139.
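The ROUGE-2 F1 score reported above measures bigram overlap between a generated summary and a reference summary. The following is a minimal illustrative sketch of that computation; shared-task evaluation typically uses the official ROUGE toolkit, which adds stemming and other preprocessing not shown here.

```python
from collections import Counter


def bigrams(tokens):
    """Return the multiset of adjacent token pairs."""
    return Counter(zip(tokens, tokens[1:]))


def rouge2_f1(candidate, reference):
    """ROUGE-2 F1: harmonic mean of bigram precision and recall.

    Simplified stand-in for the official ROUGE scorer, using plain
    whitespace tokenization and lowercasing only.
    """
    cand = bigrams(candidate.lower().split())
    ref = bigrams(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped bigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, a candidate identical to its reference scores 1.0, while a candidate sharing no bigrams with the reference scores 0.0.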
