ACM Transactions on Intelligent Systems and Technology

GTAE: Graph Transformer-Based Auto-Encoders for Linguistic-Constrained Text Style Transfer


Abstract

Non-parallel text style transfer has attracted increasing research interest in recent years. Despite successes in transferring style within the encoder-decoder framework, current approaches still lack the ability to preserve the content and even the logic of the original sentences, mainly due to the large unconstrained model space or overly simplified assumptions about the latent embedding space. Since language itself is an intelligent product of humans with certain grammars and by nature has a limited, rule-based model space, relieving this problem requires reconciling the model capacity of deep neural networks with the intrinsic model constraints from human linguistic rules. To this end, we propose a method called Graph Transformer-based Auto-Encoder, which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level, to maximally retain the content and the linguistic structure of the original sentences. Quantitative experimental results on three non-parallel text style transfer tasks show that our model outperforms state-of-the-art methods in content preservation, while achieving comparable performance on transfer accuracy and sentence naturalness.
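To make the core idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of what "modeling a sentence as a linguistic graph and extracting features at the graph level" can look like: a toy sentence with hand-written dependency edges, and a single self-attention step whose attention is masked to the parse graph so that node features follow the linguistic structure. The edge list, embedding size, and variable names are all hypothetical.

```python
# Minimal sketch, assuming a dependency parse as the linguistic graph.
# Self-attention is restricted to parse-graph neighbours so that feature
# extraction respects the sentence's linguistic structure.
import numpy as np

tokens = ["the", "food", "was", "great"]
# Hypothetical dependency edges as (head index, dependent index) pairs.
edges = [(1, 0), (2, 1), (2, 3)]

n, d = len(tokens), 8
rng = np.random.default_rng(0)
x = rng.normal(size=(n, d))            # stand-in token embeddings

# Adjacency mask with self-loops: attention is allowed only along parse edges.
mask = np.eye(n, dtype=bool)
for head, dep in edges:
    mask[head, dep] = mask[dep, head] = True

# One graph-masked self-attention layer (single head, purely illustrative).
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))
q, k, v = x @ wq, x @ wk, x @ wv
scores = q @ k.T / np.sqrt(d)
scores[~mask] = -1e9                   # block attention outside the graph
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
h_out = attn @ v                       # structure-aware node features

print(h_out.shape)                     # (4, 8): one vector per token
```

In the paper's setting, such graph-constrained layers would be stacked inside an auto-encoder so that style is transferred while the parse-level structure (and hence the content) of the input is retained; this sketch only shows the masking mechanism itself.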
机译:近年来,非并行文本风格转移吸引了越来越多的研究兴趣。尽管在基于编码器解码器框架转移了这种风格的成功,但目前的方法仍然缺乏保留原始句子的内容甚至逻辑的能力,主要是由于潜在嵌入空间的大不受约束的模型空间或过于简化的假设。由于语言本身是具有某些语法的人类的智能产品,并且通过其性质具有有限的规则的模型空间,因此缓解了这个问题需要与人类语言规则的内在模型约束来调和深神经网络的模型容量。为此,我们提出了一种称为图形变换器的自动编码器的方法,该方法将句子绘制为语言图,并在图表水平上执行特征提取和样式传输,以最大地保留原始句子的内容和语言结构。定量实验结果在三种非平行文本的传输任务方面表明,我们的模型优于内容保存的最先进方法,同时实现了转移准确性和句子自然的可比性。
