IEEE International Conference on Acoustics, Speech and Signal Processing

An Attention-aware Bidirectional Multi-residual Recurrent Neural Network (Abmrnn): A Study about Better Short-term Text Classification



Abstract

Long Short-Term Memory (LSTM) has proven an effective way to model sequential data because of its ability to mitigate the vanishing-gradient problem during training. However, owing to the limited memory capacity of LSTM cells, LSTM is weak at capturing long-term dependencies in sequential data. To address this challenge, we propose an Attention-aware Bidirectional Multi-residual Recurrent Neural Network (ABMRNN). Built on LSTM, our model considers both past and future information at every time step, with an attention mechanism that attends over all time steps. In addition, our model leverages a multi-residual mechanism that models the relationship between the current time step and more distant time steps, rather than only the immediately preceding one. Experimental results show that our model achieves state-of-the-art performance on classification tasks.
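The abstract describes two ingredients layered on top of a bidirectional LSTM: residual connections from several earlier time steps (not just t−1), and an attention pooling over all time steps. A minimal numpy sketch of those two pieces is shown below; the residual `offsets`, the weight shapes, and the function names are illustrative assumptions, since the abstract does not specify the paper's exact configuration, and the hidden states `h` stand in for the output of a bidirectional LSTM.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_residual(h, offsets=(1, 2, 4)):
    """Add skip connections from several earlier time steps, not just t-1.

    h: (T, d) array of hidden states (e.g. from a bidirectional LSTM).
    offsets: hypothetical set of residual distances; the paper's exact
    choice is not given in the abstract.
    """
    T, _ = h.shape
    out = h.copy()
    for t in range(T):
        for k in offsets:
            if t - k >= 0:
                out[t] += h[t - k]
    return out

def attention_pool(h, w, v):
    """Score every time step, softmax-normalise, return the weighted sum."""
    scores = np.tanh(h @ w) @ v        # (T,) one score per time step
    alpha = softmax(scores)            # attention weights over all steps
    return alpha @ h                   # (d,) pooled sequence representation

# Toy example: 6 time steps, hidden size 4.
rng = np.random.default_rng(0)
T, d = 6, 4
h = rng.standard_normal((T, d))
w = rng.standard_normal((d, d))
v = rng.standard_normal(d)

rep = attention_pool(multi_residual(h), w, v)
print(rep.shape)  # (4,)
```

The pooled vector `rep` would then feed a linear classifier; in the full model the attention is computed after the residual combination so that distant context contributes to every step's score.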
