Neural Computation and Psychology Workshop

PROCESSING SYMBOLIC SEQUENCES USING ECHO-STATE NETWORKS



Abstract

A novel recurrent neural network (RNN) model, called the echo state network (ESN), was successfully applied in several time series processing tasks. Using ESNs for processing symbolic sequences also seems to be a promising possibility. But ESNs, like RNNs initialized with small weights, share some properties with variable length Markov models when used for processing symbolic sequences. We call this phenomenon a Markovian architectural bias, since a meaningful and potentially useful Markov-like state space organization is present in an RNN prior to any training. In this paper we first explain this notion in detail. Then, in an experimental section, we compare the performance of ESNs with connectionist models that explicitly use the Markovian architectural bias property and with variable length Markov models (VLMMs). We show that ESN performance remains almost the same for a wide range of parameters and that the number of reservoir units plays a similar role to the number of context units of other models. We show that ESNs, like other connectionist models with a state space organized according to the Markovian architectural bias property, cannot generalize well on subsequences not present in the training set.
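The sketch below is a minimal illustration (not the authors' implementation) of the Markovian architectural bias the abstract refers to: an untrained reservoir with small random weights is driven by one-hot encoded symbols, and sequences that share a recent suffix end up with nearby reservoir states. The alphabet size, reservoir size, weight ranges, and the 0.5 spectral radius are assumed values chosen purely for illustration.

```python
import numpy as np

# Minimal sketch of an untrained echo-state reservoir driven by symbols.
# With small weights the reservoir contracts its state space, so the final
# state mostly reflects the recent suffix of the input sequence
# (the Markovian architectural bias). All sizes/scalings are assumptions.

rng = np.random.default_rng(0)

n_symbols = 3        # alphabet size (assumed)
n_reservoir = 100    # number of reservoir units (assumed)

# Input and recurrent weights are random and never trained.
W_in = rng.uniform(-0.1, 0.1, size=(n_reservoir, n_symbols))
W = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_reservoir))
# Rescale the recurrent matrix to a small spectral radius (echo-state /
# small-weight regime).
W *= 0.5 / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_state(sequence):
    """Run the reservoir over a symbolic sequence and return the final state."""
    x = np.zeros(n_reservoir)
    for s in sequence:
        u = np.eye(n_symbols)[s]        # one-hot encode the symbol
        x = np.tanh(W_in @ u + W @ x)   # standard ESN state update
    return x

# Two sequences with different prefixes but the same recent suffix [0, 1, 2]
# yield nearby states; a sequence ending differently lands farther away.
x1 = reservoir_state([2, 2, 0, 1, 2])
x2 = reservoir_state([1, 0, 0, 1, 2])
x3 = reservoir_state([0, 1, 2, 2, 1])

print("same suffix distance:     ", np.linalg.norm(x1 - x2))
print("different suffix distance:", np.linalg.norm(x1 - x3))
```

Reading off such reservoir states with a simple trained readout is what makes the untrained state space organization "potentially useful", in the sense discussed in the abstract.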
