Workshop on Representation Learning for NLP

Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization



Abstract

In this work, we introduce temporal hierarchies to the sequence-to-sequence (seq2seq) model to tackle the problem of abstractive summarization of scientific articles. The proposed Multiple Timescale model of the Gated Recurrent Unit (MT-GRU) is implemented in the encoder-decoder setting to better handle the multiple levels of compositionality present in longer texts. The proposed model is compared to the conventional RNN encoder-decoder, and the results demonstrate that our model trains faster and shows significant performance gains. The results also show that temporal hierarchies improve the ability of seq2seq models to capture compositionality without requiring highly complex architectural hierarchies.
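The core idea of a multiple-timescale recurrent unit can be sketched as leaky integration: each layer is assigned a time constant τ, and the standard GRU output is blended with the previous hidden state by a factor 1/τ, so layers with larger τ evolve more slowly and capture coarser structure. The sketch below is a minimal, hypothetical numpy illustration of this mechanism (class name `MTGRUCell`, parameter `tau`, and the weight layout are our own assumptions, not the authors' exact formulation); with τ = 1 it reduces to a plain GRU cell.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MTGRUCell:
    """Hypothetical minimal sketch of a GRU cell with a fixed timescale tau.

    The standard GRU update h_gru is blended with the previous state via
    leaky integration: h_t = (1 - 1/tau) * h_{t-1} + (1/tau) * h_gru.
    Larger tau -> slower dynamics; tau = 1 recovers the standard GRU.
    """

    def __init__(self, input_size, hidden_size, tau=1.0, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # stacked weights for the update (z), reset (r), and candidate (n) gates
        self.W = rng.uniform(-s, s, (3 * hidden_size, input_size))
        self.U = rng.uniform(-s, s, (3 * hidden_size, hidden_size))
        self.b = np.zeros(3 * hidden_size)
        self.tau = tau
        self.H = hidden_size

    def step(self, x, h):
        H = self.H
        Wx = self.W @ x + self.b
        Uh = self.U @ h
        z = sigmoid(Wx[:H] + Uh[:H])            # update gate
        r = sigmoid(Wx[H:2 * H] + Uh[H:2 * H])  # reset gate
        n = np.tanh(Wx[2 * H:] + r * Uh[2 * H:])  # candidate state
        h_gru = (1 - z) * n + z * h             # standard GRU update
        # multiple-timescale leaky integration of the GRU output
        return (1 - 1 / self.tau) * h + (1 / self.tau) * h_gru
```

In an encoder-decoder, one would stack such cells with increasing τ (e.g. τ = 1 for the fast, word-level layer and larger τ for slower, sentence-level layers), letting the temporal hierarchy emerge from the time constants rather than from extra architectural machinery.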