International Conference on Computational Intelligence

A Review of Weight Optimization Techniques in Recurrent Neural Networks

Abstract

Recurrent neural networks (RNNs) have gained considerable attention from researchers working on time series data processing and have proved to be an ideal choice for such data. As a result, several studies have analyzed time series data and data processing through a variety of RNN techniques. However, every type of RNN has its own flaws. The Simple Recurrent Neural Network (SRNN) is computationally less complex than other RNN variants such as Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). However, the SRNN suffers from drawbacks such as the vanishing gradient problem, which makes it difficult to train on long-term dependencies. The vanishing gradient arises during SRNN training because the gradient is repeatedly multiplied by small values when the most traditional optimization algorithm, Gradient Descent (GD), is used. Researchers therefore aim to overcome such limitations by applying weight optimization techniques such as metaheuristic algorithms. The objective of this paper is to present an extensive review of the challenges and issues of RNN weight optimization techniques and to critically analyze the existing proposed techniques. The authors believe that this review will serve as a main source of the techniques and methods used to address RNN time series data and data processing problems. Furthermore, current challenges and issues are discussed to identify promising directions for further research.
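To make the idea concrete, the sketch below (not taken from the paper) illustrates gradient-free weight optimization of an SRNN: instead of Gradient Descent, a simple (1+1) random-search metaheuristic perturbs a flattened weight vector and keeps improving candidates, so no backpropagated gradient product is ever formed. The toy sine-prediction task, the network sizes, and the choice of random search as the metaheuristic are illustrative assumptions only.

```python
# Minimal sketch: metaheuristic (random-search) weight optimization of an SRNN,
# assuming a toy next-value prediction task on a noisy sine wave.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: predict the next value of a noisy sine wave from the previous 10 steps.
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.05 * rng.standard_normal(t.size)
X = np.stack([series[i:i + 10] for i in range(len(series) - 10)])   # (N, 10)
y = series[10:]                                                      # (N,)

n_in, n_hidden = 1, 8
shapes = {"W_xh": (n_in, n_hidden), "W_hh": (n_hidden, n_hidden),
          "W_hy": (n_hidden, 1), "b_h": (n_hidden,), "b_y": (1,)}
sizes = {k: int(np.prod(s)) for k, s in shapes.items()}
dim = sum(sizes.values())

def unpack(theta):
    """Split a flat parameter vector into the SRNN weight matrices."""
    params, i = {}, 0
    for k, s in shapes.items():
        params[k] = theta[i:i + sizes[k]].reshape(s)
        i += sizes[k]
    return params

def forward(theta, X):
    """Run the SRNN over each length-10 window and return the final-step output."""
    p = unpack(theta)
    h = np.zeros((X.shape[0], n_hidden))
    for step in range(X.shape[1]):
        x_t = X[:, step:step + 1]                                    # (N, 1)
        h = np.tanh(x_t @ p["W_xh"] + h @ p["W_hh"] + p["b_h"])
    return (h @ p["W_hy"] + p["b_y"]).ravel()

def mse(theta):
    return float(np.mean((forward(theta, X) - y) ** 2))

# (1+1) random search: perturb the current best weights, keep improvements.
best = 0.1 * rng.standard_normal(dim)
best_loss = mse(best)
sigma = 0.1
for it in range(2000):
    cand = best + sigma * rng.standard_normal(dim)
    loss = mse(cand)
    if loss < best_loss:          # greedy acceptance, no gradients involved
        best, best_loss = cand, loss
    if it % 500 == 0:
        print(f"iter {it:4d}  mse {best_loss:.4f}")

print("final mse:", best_loss)
```

The same structure accepts any population-based metaheuristic (e.g., PSO or a genetic algorithm) by replacing the perturb-and-accept loop with that algorithm's update rule, since the SRNN is treated purely as a black-box loss function of its flattened weights.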
