Neurocomputing

Optimizing simple deterministically constructed cycle reservoir network with a Redundant Unit Pruning Auto-Encoder algorithm

Abstract

The Echo State Network (ESN) is a specific form of recurrent neural network that exhibits very rich dynamics owing to its reservoir of hidden neurons. As such, the ESN is regarded as a powerful approach for modeling real-valued time series processes. Nevertheless, the ESN has been criticized for its manually tuned or brute-force-searched parameters, such as the initial input weights and reservoir layer weights: a conventional randomly generated ESN is unlikely to be optimal because its reservoir layer weights and input layer weights are created at random. The Simple Cycle Reservoir Network (SCRN), whose input and internal layer weights are deterministically constructed, can yield performance comparable with the conventional ESN. A Redundant Unit Pruning Auto-Encoder (RUP-AE) algorithm is proposed to optimize the input layer weights of the SCRN and to resolve the problem of an ill-conditioned output weights matrix in the SCRN through an unsupervised pre-training process. First, the output weights matrix of the SCRN is pre-trained on the training data with the pseudo-inverse algorithm. Next, the pre-trained output weights matrix is pruned by a Redundant Unit Pruning (RUP) algorithm. Finally, the pruned output weights matrix is injected into the input weights matrix to preserve the specificity of the autoencoder. Three tasks, namely a nonlinear time series system identification task, a real-valued time series benchmark, and a standard chaotic time series benchmark, are used to demonstrate the advantage and superiority of RUP-AE. Extensive experimental results show that RUP-AE is effective in improving the performance of the SCRN. Moreover, RUP-AE is able to resolve the problem of an ill-conditioned output weights matrix in the SCRN. (C) 2019 Elsevier B.V. All rights reserved.
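The abstract describes the SCRN pipeline only at a high level. The following is a minimal sketch, not the authors' implementation: it builds a deterministic cycle reservoir with fixed-magnitude input weights, trains the readout with the pseudo-inverse, and uses a simple magnitude-based pruning step as a stand-in for the paper's redundant-unit criterion. Parameter names such as `r`, `v`, `washout`, and `keep_ratio`, and the surrogate target sequence, are assumptions for illustration.

```python
import numpy as np

def build_scrn(n_in, n_res, r=0.9, v=0.5):
    """Deterministic cycle reservoir: unit i feeds unit i+1 with fixed weight r."""
    W = np.zeros((n_res, n_res))
    for i in range(n_res):
        W[(i + 1) % n_res, i] = r
    # Deterministic input weights: fixed magnitude v, signs from a fixed pattern
    # (alternating here; the SCRN literature uses an aperiodic sign sequence).
    signs = np.array([1 if i % 2 == 0 else -1 for i in range(n_res * n_in)])
    W_in = v * signs.reshape(n_res, n_in)
    return W_in, W

def run_reservoir(W_in, W, U, washout=50):
    """Drive the reservoir with input sequence U and collect post-washout states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in U:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)[washout:]

def train_readout(states, targets, washout=50):
    """Pseudo-inverse (least-squares) solution for the output weights matrix."""
    Y = np.atleast_2d(targets[washout:]).T
    return np.linalg.pinv(states) @ Y

def prune_redundant_units(W_out, keep_ratio=0.8):
    """Illustrative pruning: keep the readout rows with the largest magnitude,
    standing in for the redundancy criterion of the RUP algorithm."""
    importance = np.abs(W_out).sum(axis=1)
    k = int(len(importance) * keep_ratio)
    return np.sort(np.argsort(importance)[::-1][:k])

# Toy usage on a scalar sequence with a simple moving-average surrogate target.
rng = np.random.default_rng(0)
U = rng.uniform(0, 0.5, 1000)
y = np.convolve(U, np.ones(10) / 10, mode="same")
W_in, W = build_scrn(n_in=1, n_res=100)
S = run_reservoir(W_in, W, U)
W_out = train_readout(S, y)
keep = prune_redundant_units(W_out)
print("kept reservoir units:", len(keep))
```

In the paper's scheme the pruned output weights are then injected back into the input weights matrix, closing the autoencoder-style loop; that injection step is omitted here because its exact form depends on details given in the full text.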
