
Chinese Story Generation with FastText Transformer Network

Abstract

Sequence transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder. The highest-accuracy models usually connect the encoder and decoder through an attention mechanism. Story generation is an important task: if computers can learn the ability of story-telling, they can help people do more things. Sequence-to-sequence models combined with an attention mechanism are already used for Chinese poetry generation. However, this approach is difficult to apply directly to Chinese story generation, because Chinese poetry generation relies on fixed structural rules that stories do not have. Therefore, we use 1372 human-labeled summaries of paragraphs from the classic novel "Demi-Gods and Semi-Devils" (天龍八部) to train a transformer network. In our experiment, we use FastText to combine the Demi-Gods and Semi-Devils dataset and A Large Scale Chinese Short Text Summarization Dataset as the input data. In addition, we obtained a lower loss by using two layers of self-attention.
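Read concretely, the setup the abstract describes can be sketched roughly as follows: pre-trained FastText vectors serve as the input embeddings of an encoder-decoder transformer whose encoder and decoder each use two self-attention layers. The PyTorch sketch below is only an illustration of that setup, not the authors' implementation; the vocabulary size, dimensions, class name, and the random stand-in for real FastText vectors are all assumptions.

```python
import torch
import torch.nn as nn

VOCAB_SIZE, EMB_DIM = 8000, 300   # hypothetical vocabulary size and FastText dimension

# Stand-in for a pre-trained FastText embedding matrix; in practice these vectors
# would be trained on the combined Demi-Gods and Semi-Devils / short-text corpora.
fasttext_vectors = torch.randn(VOCAB_SIZE, EMB_DIM)

class StoryTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        # Use the (frozen) FastText vectors as token embeddings.
        self.embed = nn.Embedding.from_pretrained(fasttext_vectors, freeze=True)
        # Two self-attention layers in both encoder and decoder, as the abstract describes.
        # (Positional encodings are omitted here for brevity.)
        self.transformer = nn.Transformer(
            d_model=EMB_DIM, nhead=6,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.out = nn.Linear(EMB_DIM, VOCAB_SIZE)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)   # summary tokens -> FastText vectors
        tgt = self.embed(tgt_ids)   # story tokens   -> FastText vectors
        # Causal mask so the decoder cannot attend to future story tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)     # per-token vocabulary logits

model = StoryTransformer()
summary_ids = torch.randint(0, VOCAB_SIZE, (2, 20))   # a batch of 2 paragraph summaries
story_ids = torch.randint(0, VOCAB_SIZE, (2, 50))     # the corresponding story paragraphs
logits = model(summary_ids, story_ids)
print(logits.shape)   # torch.Size([2, 50, 8000])
```

In this reading, the summarized paragraph is the source sequence and the original story paragraph is the target, so the model learns to expand a summary into a story; the loss would be a standard cross-entropy over the predicted tokens.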
