International Conference System Modeling and Advancement in Research Trends

Vietnamese Question Answering System from Multilingual BERT Models to Monolingual BERT Model



Abstract

Question answering (QA) systems based on natural language processing and deep learning have received growing attention from the AI community, and many companies and organizations are interested in developing automated QA systems, which are being researched widely. Recently, the Bidirectional Encoder Representations from Transformers (BERT) model was proposed to overcome limitations of earlier NLP approaches, and BERT achieved state-of-the-art results on almost all tasks, including QA. In this work, we applied three multilingual BERT models (multilingual BERT [1], DeepPavlov multilingual BERT, and multilingual BERT fine-tuned on XQuAD) and the language-specific BERT model for Vietnamese (PhoBERT). The obtained results show that the monolingual model outperforms the multilingual models. We also recommend multilingual BERT fine-tuned on XQuAD as the preferred option if a Vietnamese QA system must be built from a multilingual BERT-based model.
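Comparisons between extractive QA models like these are conventionally reported with SQuAD-style exact-match (EM) and token-overlap F1 scores. The paper does not include its evaluation code, so the following is only a minimal illustrative sketch of those two standard metrics (simplified: full SQuAD normalization also strips English articles, which is omitted here since it does not apply to Vietnamese):

```python
import string
from collections import Counter

def normalize(text: str) -> str:
    # Lowercase, drop punctuation, collapse whitespace (SQuAD-style normalization,
    # without English article removal).
    text = "".join(ch for ch in text.lower() if ch not in set(string.punctuation))
    return " ".join(text.split())

def exact_match(prediction: str, gold: str) -> float:
    # 1.0 if the normalized strings are identical, else 0.0.
    return float(normalize(prediction) == normalize(gold))

def f1_score(prediction: str, gold: str) -> float:
    # Token-level F1 between predicted and gold answer spans.
    pred_tokens = normalize(prediction).split()
    gold_tokens = normalize(gold).split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("Hà Nội", "hà nội"))                 # 1.0
print(f1_score("the capital Hanoi", "Hanoi"))          # 0.5
```

Averaging these two scores over a held-out test set is how one model (e.g. PhoBERT) is judged to outperform another (e.g. multilingual BERT) on the same Vietnamese QA data.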
