Annual Conference of the International Speech Communication Association

LSTM, GRU, Highway and a Bit of Attention: An Empirical Overview for Language Modeling in Speech Recognition



Abstract

Popularized by the long short-term memory (LSTM), multiplicative gates have become a standard means to design artificial neural networks with intentionally organized information flow. Notable examples of such architectures include gated recurrent units (GRU) and highway networks. In this work, we first focus on the evaluation of each of the classical gated architectures for language modeling for large vocabulary speech recognition. Namely, we evaluate the highway network, lateral network, LSTM and GRU. Furthermore, the motivation underlying the highway network also applies to LSTM and GRU. An extension specific to the LSTM has been recently proposed with an additional highway connection between the memory cells of adjacent LSTM layers. In contrast, we investigate an approach which can be used with both LSTM and GRU: a highway network in which the LSTM or GRU is used as the transformation function. We found that the highway connections enable both standalone feedforward and recurrent neural language models to benefit more from the deep structure and provide a slight improvement of recognition accuracy after interpolation with count models. To complete the overview, we include our initial investigations on the use of the attention mechanism for learning word triggers.
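To make the "highway network with LSTM as transformation function" idea concrete, the following is a minimal sketch, assuming a PyTorch-style implementation; the module and parameter names are illustrative and not taken from the paper. A highway layer combines a transform output H(x) with its input x through a learned transform gate T(x), as y = T(x)·H(x) + (1 − T(x))·x, and here the transform H(x) is an LSTM layer.

```python
import torch
import torch.nn as nn

class HighwayLSTM(nn.Module):
    """Illustrative sketch only: one highway layer whose transformation
    function H(x) is an LSTM, combined as y = T(x)*H(x) + (1 - T(x))*x."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Input and output widths must match so the carry path (1 - T)*x is valid.
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.transform_gate = nn.Linear(hidden_size, hidden_size)

    def forward(self, x, state=None):
        h, state = self.lstm(x, state)             # H(x): LSTM as the transform function
        t = torch.sigmoid(self.transform_gate(x))  # T(x): transform gate
        y = t * h + (1.0 - t) * x                  # highway combination, carry gate = 1 - T(x)
        return y, state

# Hypothetical usage: 8 sequences of 20 word embeddings, width 256.
layer = HighwayLSTM(256)
x = torch.randn(8, 20, 256)
y, _ = layer(x)
print(y.shape)  # torch.Size([8, 20, 256])
```

Because the carry path passes the layer input through unchanged when T(x) is small, several such layers can be stacked without the optimization difficulties of a plain deep stack, which is the behavior the abstract attributes to the highway connections; the same wrapper applies with a GRU in place of the LSTM.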
