Scalable Cross-Lingual Transfer of Neural Sentence Embeddings

Abstract

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised inference classifier, InferSent, and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings, where we observe better performance compared to joint models in intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.
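As a rough illustration of the sentence-mapping framework the abstract mentions, one common instantiation (an assumption here, not a method confirmed by the paper) learns a linear map from one language's sentence-embedding space into another's, supervised by embeddings of parallel sentences. The sketch below uses synthetic embeddings and a least-squares fit; all dimensions, names, and data are illustrative.

```python
import numpy as np

# Hypothetical sketch of a "sentence mapping" alignment: learn a linear
# map W taking foreign-language sentence embeddings X into the English
# embedding space Y, using parallel sentence pairs as supervision.
# Synthetic data stands in for real encoder outputs.
rng = np.random.default_rng(0)
d = 8     # embedding dimensionality (illustrative)
n = 100   # number of parallel sentence pairs

Y = rng.standard_normal((n, d))        # "English" sentence embeddings
W_true = rng.standard_normal((d, d))   # unknown ground-truth relation
X = Y @ np.linalg.inv(W_true)          # "foreign" embeddings (exactly linear here)

# Least-squares fit: argmin_W ||X W - Y||_F
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Mapped foreign embeddings should now align with the English space.
err = np.linalg.norm(X @ W - Y) / np.linalg.norm(Y)
print(err)
```

In practice the map could also be constrained (e.g. to be orthogonal) or replaced by a small nonlinear network; the abstract's comparison of frameworks suggests the choice matters most when parallel data is scarce.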
