International Symposium on Artificial Intelligence and Robotics

Chinese Poetry and Couplet Automatic Generation Based on Self-attention and Multi-task Neural Network Model



Abstract

Poetry and couplets, as a valuable part of the human cultural heritage, carry traditional Chinese culture. Automatic couplet and poetry generation are challenging tasks for natural language processing (NLP). This paper proposes a new multi-task neural network model for the automatic generation of poetry and couplets. The model uses a seq2seq encoder-decoder structure that combines an attention mechanism, a self-attention mechanism, and multi-task learning with parameter sharing. The encoder consists of two BiLSTM networks that learn the characteristics shared by classical poems and couplets: one encodes the keywords, and the other encodes the poem lines or couplet sentences generated so far. The decoder parameters are not shared; the decoder consists of two LSTM networks that generate poems and couplets respectively, in order to preserve the distinct semantic and grammatical features of classical poems and couplets. Because poetry and couplets share many characteristics, multi-task learning can acquire additional features from the related tasks and thus generalize better. We therefore use a multi-task model to generate poems and couplets, which performs significantly better than a single-task model. Our model also introduces a self-attention mechanism to learn the dependencies and internal structure of the words in a sentence. Finally, the effectiveness of the method is verified by automatic and manual evaluations.
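The abstract describes a shared-encoder, task-specific-decoder layout: two BiLSTM encoders (keywords and previously generated lines) whose parameters are shared across both tasks, and separate LSTM decoders with attention for poetry and for couplets. Below is a minimal sketch of that layout, assuming PyTorch; the module names, dimensions, and the dot-product attention used here are illustrative assumptions rather than the paper's exact implementation, and the self-attention component is omitted for brevity.

```python
# Minimal sketch (assumption: PyTorch; names and dimensions are hypothetical).
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    # BiLSTM encoder; its parameters are shared by the poetry and couplet tasks.
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

    def forward(self, tokens):                      # tokens: (batch, src_len)
        out, _ = self.bilstm(self.embed(tokens))    # out: (batch, src_len, 2*hidden_dim)
        return out

class TaskDecoder(nn.Module):
    # Task-specific LSTM decoder with dot-product attention over encoder states;
    # poetry and couplets each get their own instance (parameters are NOT shared).
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim * 2, vocab_size)

    def forward(self, prev_tokens, enc_states):     # prev_tokens: (batch, tgt_len)
        dec_out, _ = self.lstm(self.embed(prev_tokens))           # (batch, tgt_len, hid)
        scores = torch.bmm(dec_out, enc_states.transpose(1, 2))   # (batch, tgt_len, src_len)
        context = torch.bmm(torch.softmax(scores, dim=-1), enc_states)
        return self.out(torch.cat([dec_out, context], dim=-1))    # logits over vocab

class MultiTaskPoetryCoupletModel(nn.Module):
    # Two shared encoders (keywords + already-generated lines) feed either the
    # poetry decoder or the couplet decoder, depending on the current task.
    def __init__(self, vocab_size):
        super().__init__()
        self.keyword_encoder = SharedEncoder(vocab_size)
        self.context_encoder = SharedEncoder(vocab_size)
        self.poetry_decoder = TaskDecoder(vocab_size)
        self.couplet_decoder = TaskDecoder(vocab_size)

    def forward(self, keywords, context, prev_tokens, task):
        enc = torch.cat([self.keyword_encoder(keywords),
                         self.context_encoder(context)], dim=1)   # concat along time axis
        decoder = self.poetry_decoder if task == "poetry" else self.couplet_decoder
        return decoder(prev_tokens, enc)
```

Keeping the decoders separate while sharing the encoders matches the trade-off stated in the abstract: the shared encoders exploit what poems and couplets have in common, while the per-task decoders preserve their different semantic and grammatical conventions.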
