1st Workshop on Representation Learning for NLP, 2016

Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization



Abstract

In this work, we introduce temporal hierarchies into the sequence-to-sequence (seq2seq) model to tackle the problem of abstractive summarization of scientific articles. The proposed Multiple Timescale Gated Recurrent Unit (MT-GRU) is implemented in the encoder-decoder setting to better handle the multiple levels of compositionality present in longer texts. The proposed model is compared to the conventional RNN encoder-decoder, and the results demonstrate that our model trains faster and shows significant performance gains. The results also show that temporal hierarchies improve the ability of seq2seq models to capture compositionality without requiring highly complex architectural hierarchies.
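The multiple-timescale mechanism the abstract describes can be sketched as a GRU cell whose hidden state is leaky-integrated with a time constant τ, so that layers with larger τ update more slowly and can track slower compositional levels (words, sentences, discourse). This is an illustrative reconstruction under that assumption, not the paper's exact formulation; the function name `mt_gru_step`, the parameter layout, and the τ-interpolation are all assumptions for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mt_gru_step(x, h_prev, params, tau):
    """One step of a multiple-timescale GRU (hypothetical sketch).

    tau >= 1 is the timescale: tau = 1 recovers a standard GRU step,
    while larger tau makes the hidden state change more slowly.
    """
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h_prev)            # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)            # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))
    h_gru = (1 - z) * h_prev + z * h_tilde       # standard GRU update
    # Leaky integration with time constant tau: the multiple-timescale part.
    return (1.0 - 1.0 / tau) * h_prev + (1.0 / tau) * h_gru

# Toy demonstration: the same input moves a slow layer less than a fast one.
rng = np.random.default_rng(0)
dx, dh = 4, 3
params = [rng.standard_normal((dh, d)) * 0.1 for d in (dx, dh, dx, dh, dx, dh)]
h0 = np.zeros(dh)
x = rng.standard_normal(dx)
h_fast = mt_gru_step(x, h0, params, tau=1.0)  # behaves like a plain GRU
h_slow = mt_gru_step(x, h0, params, tau=8.0)  # slow layer drifts less
```

In an encoder-decoder, stacking layers with increasing τ gives the temporal hierarchy the abstract refers to, without adding any extra architectural machinery beyond the per-layer time constant.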

Record details

  • Source
  • Venue: Berlin (DE)
  • Author affiliations

    School of Electronics Engineering, Kyungpook National University, Daegu, South Korea;

    School of Electronics Engineering, Kyungpook National University, Daegu, South Korea;

    School of Electronics Engineering, Kyungpook National University, Daegu, South Korea;

  • Conference organizer
  • Format: PDF
  • Language: eng
  • Classification
  • Keywords

