Annual Meeting of the Association for Computational Linguistics

Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation


Abstract

Disentangling content and style in the latent space is prevalent in unpaired text style transfer. However, two major issues exist in most current neural models. 1) It is difficult to completely strip the style information from the semantics of a sentence. 2) The recurrent neural network (RNN) based encoder and decoder, mediated by the latent representation, cannot handle long-term dependencies well, resulting in poor preservation of non-stylistic semantic content. In this paper, we propose the Style Transformer, which makes no assumption about the latent representation of the source sentence and leverages the attention mechanism of the Transformer to achieve better style transfer and better content preservation. Source code will be available on GitHub.
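
To make the architectural idea concrete, below is a minimal sketch (in PyTorch, under assumptions not taken from the paper): a standard Transformer that conditions generation on a target style by prepending a learned style embedding to the source tokens, so attention can see both the style signal and the full source sentence without squeezing them through a disentangled latent bottleneck. The class name, dimensions, and conditioning scheme are illustrative only; the authors' actual model and training procedure may differ from this sketch.

# A minimal sketch, assuming a PyTorch-style setup; NOT the authors' released code.
# It only illustrates the idea stated in the abstract: no disentangled latent
# representation, just a Transformer whose attention mixes a target-style
# embedding with the full source sentence.
import torch
import torch.nn as nn

class StyleConditionedTransformer(nn.Module):
    def __init__(self, vocab_size, num_styles, d_model=256, nhead=4, num_layers=3):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.style_emb = nn.Embedding(num_styles, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out_proj = nn.Linear(d_model, vocab_size)
        # Positional encodings are omitted for brevity; a real model needs them.

    def forward(self, src_tokens, style_ids, tgt_tokens):
        # Prepend the target-style embedding as an extra source "token" so the
        # encoder-decoder attention can combine style and content directly,
        # instead of routing the sentence through a style-free latent vector.
        src = self.token_emb(src_tokens)                 # (B, S, d)
        style = self.style_emb(style_ids).unsqueeze(1)   # (B, 1, d)
        src = torch.cat([style, src], dim=1)             # (B, S+1, d)
        tgt = self.token_emb(tgt_tokens)                 # (B, T, d)
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out_proj(hidden)                     # (B, T, vocab_size)

# Illustrative forward pass with dummy data.
model = StyleConditionedTransformer(vocab_size=10000, num_styles=2)
src = torch.randint(0, 10000, (8, 20))   # source sentences
tgt = torch.randint(0, 10000, (8, 20))   # shifted target sentences
style = torch.randint(0, 2, (8,))        # desired target style per sentence
logits = model(src, style, tgt)          # shape: (8, 20, 10000)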
