
CUNI NMT System for WAT 2017 Translation Tasks


Abstract

The paper presents this year's CUNI submissions to the WAT 2017 Translation Task, focusing on Japanese-English translation, namely the Scientific Papers sub-task, the Patents sub-task and the Newswire sub-task. We compare two neural network architectures, the standard sequence-to-sequence model with attention (Seq2Seq) (Bahdanau et al., 2014) and an architecture using a convolutional sentence encoder (FBConv2Seq) described by Gehring et al. (2017), both implemented in the NMT framework Neural Monkey, whose development we currently participate in. We also compare various types of preprocessing of the source Japanese sentences and their impact on the overall results. Furthermore, we include the results of our experiments with out-of-domain data obtained by combining the corpora provided for each sub-task.
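As a rough illustration of the Seq2Seq-with-attention baseline mentioned in the abstract, the sketch below implements additive (Bahdanau-style) attention in plain NumPy. The parameter names, shapes, and the toy usage are illustrative assumptions and are not taken from the Neural Monkey implementation or the paper's actual configuration.

```python
# Minimal sketch of additive (Bahdanau-style) attention, as used in the
# Seq2Seq baseline. Shapes and names are illustrative assumptions only.
import numpy as np

def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    """Return a context vector as an attention-weighted sum of encoder states.

    decoder_state:  (d_dec,)   current decoder hidden state
    encoder_states: (T, d_enc) hidden states for the T source tokens
    W_dec, W_enc, v:           parameters of the attention energy function
    """
    # e_t = v^T tanh(W_dec s + W_enc h_t) for each source position t
    energies = np.tanh(encoder_states @ W_enc.T + decoder_state @ W_dec.T) @ v
    # softmax over source positions gives the attention distribution
    weights = np.exp(energies - energies.max())
    weights /= weights.sum()
    # context vector: attention-weighted average of encoder states
    return weights @ encoder_states, weights

# Toy usage with random parameters (hypothetical dimensions)
rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 5, 8, 6, 4
context, alpha = additive_attention(
    rng.normal(size=d_dec),
    rng.normal(size=(T, d_enc)),
    rng.normal(size=(d_att, d_dec)),
    rng.normal(size=(d_att, d_enc)),
    rng.normal(size=d_att),
)
print(alpha.round(3), context.shape)
```

The convolutional encoder of Gehring et al. (2017) replaces the recurrent source encoder with stacked convolutions over the token embeddings, while the attention step over the resulting encoder states remains conceptually similar to the sketch above.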
