Conference on Empirical Methods in Natural Language Processing
Language Modeling Teaches You More Syntax than Translation Does: Lessons Learned Through Auxiliary Task Analysis



Abstract

By controlling for the genre and quantity of the training data, we make fair comparisons between several data-rich training tasks in their ability to induce syntactic information. Our results suggest that for transfer learning, bidirectional language models like ELMo (Peters et al., 2018) capture more useful features than translation encoders, and that this holds even on genres for which data is not abundant. Our work also highlights the interesting behavior of untrained LSTMs, which show an ability to preserve the contents of their inputs better than trained models.
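Comparisons like the one above are typically run with auxiliary-task (probing) analysis: freeze each trained encoder, extract its representations for the same inputs, and fit a lightweight linear classifier on a syntactic label, so that differences in probe accuracy reflect differences in the features themselves. The sketch below illustrates the idea only; the random-projection "encoders", the toy data, and all function names are hypothetical stand-ins, not the paper's actual models or tasks.

```python
# Hypothetical probing-classifier sketch: two frozen "encoders" are compared
# by training the same linear probe on their features for a toy binary label.
import numpy as np

rng = np.random.default_rng(0)

def frozen_encoder(x, w):
    """Stand-in for a pretrained, frozen encoder: a fixed nonlinear projection."""
    return np.tanh(x @ w)

def train_probe(feats, labels, lr=0.5, steps=200):
    """Fit a logistic-regression probe on frozen features via gradient descent."""
    w = np.zeros(feats.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))   # probe predictions
        w -= lr * feats.T @ (p - labels) / len(labels)
        b -= lr * np.mean(p - labels)
    return w, b

def probe_accuracy(feats, labels, w, b):
    """Fraction of examples the trained probe classifies correctly."""
    return np.mean(((feats @ w + b) > 0) == labels)

# Toy "sentences": 200 input vectors with a linearly recoverable label.
x = rng.normal(size=(200, 16))
labels = (x[:, 0] + x[:, 1] > 0).astype(float)

# Two frozen encoders (different fixed projections) stand in for, e.g.,
# a language-model encoder versus a translation encoder.
enc_a = frozen_encoder(x, rng.normal(size=(16, 32)))
enc_b = frozen_encoder(x, rng.normal(size=(16, 32)))

for name, feats in [("encoder A", enc_a), ("encoder B", enc_b)]:
    w, b = train_probe(feats, labels)
    print(name, "probe accuracy:", round(probe_accuracy(feats, labels, w, b), 3))
```

Because the probe is the same low-capacity classifier in every comparison, any accuracy gap is attributable to the frozen representations rather than to the probe itself, which is the logic behind the paper's controlled comparisons.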

