
Natural Language Generation Using Dependency Tree Decoding for Spoken Dialog Systems



Abstract

In this paper, we propose a new natural language generation (NLG) method for spoken dialog systems and demonstrate its capacity. Studies on NLG often employ sequence decoding, which generates the words of a sentence in sequential order and feeds the word generated at each step as input to the next. In contrast, we propose a decoding method that generates words in the order obtained by traversing a dependency tree, feeding each step the parent and preceding sibling of the current node in the tree. As a result, the most important words are generated first, thereby enabling the words with the greatest relevance to be fed into the process. At prediction time, our model generates dependency trees and converts them into sentences. The proposed decoding method was evaluated by re-implementing a semantically controlled long short-term memory structure for NLG, with the input and predicted sequences converted to allow dependency tree decoding. The experimental results indicate that our approach, i.e., dependency tree decoding, dramatically improves the BLEU score and naturalness. Furthermore, when generating $n$-best sentences with dependency tree decoding, the word diversity of the output sentences increased by approximately 6%, yielding more varied sentence patterns.
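The abstract describes two steps: generating words in an order given by traversing a dependency tree, with each step conditioned on the parent and previously generated sibling, and then linearizing the finished tree back into a sentence. The following is a minimal Python sketch of that idea only, not the authors' implementation; the node fields, the breadth-first visit order, and the position-based linearization rule are assumptions made for illustration.

```python
# Minimal sketch (not the authors' implementation) of tree-ordered decoding:
# visit the dependency tree so that heads precede their modifiers, exposing the
# parent and previous sibling at each step, then flatten the tree into a sentence.
from collections import deque
from dataclasses import dataclass, field


@dataclass
class DepNode:
    word: str                     # surface token
    position: int                 # target position in the output sentence
    children: list = field(default_factory=list)


def generation_order(root):
    """Yield (node, parent, previous_sibling) in breadth-first order.

    A tree decoder would condition each step on the parent and the previously
    generated sibling rather than only on the preceding word of the sentence.
    """
    yield root, None, None
    queue = deque([root])
    while queue:
        node = queue.popleft()
        prev = None
        for child in node.children:
            yield child, node, prev
            queue.append(child)
            prev = child


def tree_to_sentence(root):
    """Flatten the predicted tree into a sentence by sorting on word position."""
    tokens, stack = [], [root]
    while stack:
        node = stack.pop()
        tokens.append((node.position, node.word))
        stack.extend(node.children)
    return " ".join(word for _, word in sorted(tokens))


# Toy tree for "the restaurant serves cheap food": the head verb "serves" is the
# root, so the semantically central word is produced before its modifiers.
root = DepNode("serves", 2, [
    DepNode("restaurant", 1, [DepNode("the", 0)]),
    DepNode("food", 4, [DepNode("cheap", 3)]),
])

for node, parent, sibling in generation_order(root):
    print(f"{node.word:<12} parent: {parent.word if parent else '-':<12} "
          f"sibling: {sibling.word if sibling else '-'}")
print(tree_to_sentence(root))
```

Running the sketch prints the head-first generation order with each word's parent and sibling context, followed by the recovered surface sentence "the restaurant serves cheap food", which mirrors the tree-to-sentence conversion described in the abstract.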

Bibliographic Information

  • Authors

    Youngmin Park; Sangwoo Kang;

  • Affiliation
  • Year: 2019
  • Total Pages
  • Format: PDF
  • Language: eng
  • CLC Classification
