1st EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 2018

Does Syntactic Knowledge in Multilingual Language Models Transfer Across Languages?

Abstract

Recent work has shown that neural models can be successfully trained on multiple languages simultaneously. We investigate whether such models learn to share and exploit common syntactic knowledge among the languages on which they are trained. This extended abstract presents our preliminary results.