Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics

Learning time series evolution by unsupervised extraction of correlations



Abstract

We focus on the problem of modeling time series by learning, in an unsupervised fashion, the statistical correlations between the past and present elements of the series. Such correlations are, in general, nonlinear, especially in the chaotic domain, so the learning algorithm must be able to extract higher-order statistical correlations between the elements of the time signal. This problem can be viewed as a special case of factorial learning, which may be formulated as unsupervised redundancy reduction between the output components of a transformation that conserves the transmitted information. An information-theoretic architecture and learning paradigm are introduced. The neural architecture has a single layer and a triangular structure, so that each element is transformed by observing only past elements and the transformation conserves volume; in this fashion, a transformation that guarantees transmission of information without loss is obtained. The learning rule decorrelates the output components of the network. Two methods are used: higher-order decorrelation by explicit evaluation of higher-order cumulants of the output distributions, and minimization of the sum of the entropies of the output components, which minimizes the mutual information between them because the sum of entropies is an upper bound on the joint entropy by Gibbs' second theorem. After decorrelation of the output components, the correlations between the elements of the time series can be extracted by analyzing the trained architecture. As a consequence, we are able to model chaotic and nonchaotic time series. Furthermore, one critical point in modeling time series is the determination of the dimension of the embedding vector, i.e., the number of past components needed to predict the future.
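The triangular, volume-conserving transform described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes a tanh nonlinearity and a strictly lower-triangular coupling, so each output depends on its own input plus a function of strictly earlier inputs. The Jacobian is then lower-triangular with unit diagonal, hence det J = 1 and the map conserves volume (no information is lost).

```python
import numpy as np

def triangular_transform(x, W):
    """Map x -> y where y[i] = x[i] + tanh(W[i, :i] @ x[:i]).
    Only the strictly lower-triangular part of W is used, so the
    Jacobian has unit diagonal and determinant exactly 1."""
    d = len(x)
    y = np.empty(d)
    for i in range(d):
        # each output observes only the past elements x[0..i-1]
        y[i] = x[i] + np.tanh(W[i, :i] @ x[:i])
    return y

d = 4
rng = np.random.default_rng(0)
W = rng.normal(size=(d, d))   # illustrative random weights
x = rng.normal(size=d)
y = triangular_transform(x, W)

# Numerically verify volume conservation: det(Jacobian) ~ 1,
# estimated by finite differences.
eps = 1e-6
J = np.zeros((d, d))
for j in range(d):
    xp = x.copy()
    xp[j] += eps
    J[:, j] = (triangular_transform(xp, W) - y) / eps
detJ = np.linalg.det(J)
print(detJ)
```

Because the determinant is 1 regardless of the weights, the transmitted information is conserved for any setting of `W`; learning is then free to adjust `W` purely to decorrelate the outputs.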
With this method we can detect the embedding dimension by extracting the influence of the past on the future, i.e., the correlation between the remote past and the future. Optimal embedding dimensions are obtained for the Hénon map and the Mackey-Glass series. When the data are corrupted by colored noise, a model can still be learned; the network then decorrelates the noise. In the case of modeling a chemical reaction, the most natural volume-conserving architecture is a symplectic network, which describes a system that conserves entropy and therefore the transmitted information.
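The embedding-dimension question can be illustrated on the Hénon map with a much simpler proxy than the paper's network: measure how one-step nearest-neighbour prediction error falls as more past components are included. This sketch is not the paper's method, only a hedged demonstration of the same effect; since the Hénon map in delay form is x(n+1) = 1 - a·x(n)² + b·x(n-1), the error drops sharply once d = 2 past values are used and barely improves beyond that.

```python
import numpy as np

def henon(n, a=1.4, b=0.3):
    """Hénon map in delay form: x[i] = 1 - a*x[i-1]**2 + b*x[i-2]."""
    x = np.empty(n)
    x[0] = x[1] = 0.1
    for i in range(2, n):
        x[i] = 1.0 - a * x[i - 1] ** 2 + b * x[i - 2]
    return x

def nn_prediction_error(series, d):
    """Leave-one-out nearest-neighbour one-step prediction error
    using embedding vectors of dimension d."""
    X = np.array([series[i:i + d] for i in range(len(series) - d)])
    y = series[d:]
    errs = []
    for i in range(len(X)):
        dist = np.linalg.norm(X - X[i], axis=1)
        dist[i] = np.inf                 # exclude the point itself
        errs.append((y[dist.argmin()] - y[i]) ** 2)
    return float(np.mean(errs))

s = henon(1500)[500:]                    # discard the transient
errs = {}
for d in (1, 2, 3):
    errs[d] = nn_prediction_error(s, d)
    print(d, errs[d])
# the error falls sharply from d=1 to d=2 and then saturates,
# indicating an embedding dimension of 2 for the Hénon map.
```

The same "error saturates at the true embedding dimension" signature is what the trained network exposes via the learned correlations with the remote past.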
