IEEE Transactions on Neural Networks

Extracting finite-state representations from recurrent neural networks trained on chaotic symbolic sequences

Abstract

This paper concerns neural-based modeling of symbolic chaotic time series. We investigate the knowledge induction process associated with training recurrent neural networks (RNN) on single long chaotic symbolic sequences. Even though training the RNN to predict the next symbol leaves standard performance measures, such as the mean square error on the network output, virtually unchanged, the nets extract a great deal of knowledge. We monitor the knowledge extraction process by considering the nets as stochastic sources and letting them generate sequences, which are then compared with the training sequence via information-theoretic entropy and cross-entropy measures. We also study the possibility of reformulating the knowledge gained by the RNN in the compact, easy-to-analyze form of finite-state stochastic machines. The experiments are performed on two sequences of different complexities, as measured by the size and state-transition structure of the induced Crutchfield's ε-machines (1991, 1994). The extracted machines can achieve comparable or even better entropy and cross-entropy performance. They reflect the complexity of the training sequence in their dynamical state representations, which can be reformulated by finite-state means. The findings are confirmed by a much more detailed analysis of model-generated sequences. We also introduce a visual representation of the allowed block structure in the studied sequences that gives an illustrative insight into both the RNN training and the finite-state stochastic machine extraction processes.
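
The entropy and cross-entropy monitoring described in the abstract can be made concrete with a short sketch. The following Python is a minimal illustration based on block (n-gram) statistics over the symbol alphabet; the block size n, the smoothing constant alpha, and all function names are illustrative choices, not the paper's code.

    from collections import Counter
    from math import log2

    def block_dist(seq, n):
        """Empirical distribution over length-n blocks of a symbolic sequence."""
        counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    def block_entropy(seq, n):
        """Block entropy H_n = -sum_w p(w) log2 p(w); H_n / n estimates
        the per-symbol entropy rate of the source that generated seq."""
        return -sum(p * log2(p) for p in block_dist(seq, n).values())

    def cross_entropy(train_seq, model_seq, n, alpha=1e-6):
        """Cross-entropy of the model-generated sequence's n-block
        distribution measured against the training sequence's blocks;
        additive smoothing keeps unseen blocks from giving log2(0)."""
        p_train = block_dist(train_seq, n)
        p_model = block_dist(model_seq, n)
        n_blocks = len(set(train_seq) | set(model_seq)) ** n
        def q(w):
            return (p_model.get(w, 0.0) + alpha) / (1.0 + alpha * n_blocks)
        return -sum(p * log2(q(w)) for w, p in p_train.items())

    # Example: per-symbol rates of two binary sequences (hypothetical data).
    train = "0110101101101011" * 50
    model = "0110110101101101" * 50
    print(block_entropy(train, 4) / 4, cross_entropy(train, model, 4) / 4)

A model whose generated sequences match the training source will show a cross-entropy close to the training sequence's own entropy rate; a growing gap signals block statistics the model fails to capture.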
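Reformulating the RNN's knowledge as a finite-state stochastic machine is commonly done by quantizing the recurrent state space and counting symbol-labelled transitions between the resulting regions. The sketch below follows that generic recipe with a naive k-means quantizer; it is an assumed illustration of the mechanics, not the paper's exact extraction procedure.

    import numpy as np
    from collections import defaultdict

    def extract_machine(states, symbols, n_states=8, iters=50, seed=0):
        """Quantize an RNN hidden-state trajectory into n_states regions
        with a naive k-means, then estimate a stochastic finite-state
        machine from symbol-labelled transition counts between regions.

        states  : (T, d) float array, hidden state after reading each symbol
        symbols : length-T sequence of the symbols that were read
        """
        states = np.asarray(states, dtype=float)
        rng = np.random.default_rng(seed)
        centers = states[rng.choice(len(states), n_states, replace=False)]
        for _ in range(iters):
            # assign each hidden state to its nearest center
            labels = np.argmin(
                ((states[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            for k in range(n_states):
                if np.any(labels == k):
                    centers[k] = states[labels == k].mean(axis=0)
        # count transitions (region, next symbol) -> next region
        counts = defaultdict(lambda: defaultdict(int))
        for t in range(len(labels) - 1):
            counts[(int(labels[t]), symbols[t + 1])][int(labels[t + 1])] += 1
        # normalize counts into transition probabilities
        machine = {}
        for key, nxt in counts.items():
            total = sum(nxt.values())
            machine[key] = {s: c / total for s, c in nxt.items()}
        return machine

The extracted machine can itself be run as a stochastic source, so its generated sequences can be scored with the same entropy and cross-entropy measures as the RNN's, which is the sense in which the abstract reports "comparable or even better" performance.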