IEEE Transactions on Neural Networks

Mixtures-of-experts of autoregressive time series: asymptotic normality and model specification

Abstract

We consider a class of nonlinear models based on mixtures of local autoregressive time series. At any given time point, we have a certain number of linear models, denoted as experts, in which the vector of covariates may include lags of the dependent variable. Additionally, we assume the existence of a latent multinomial variable that determines which linear process is observed; its distribution depends on the same covariates as the experts. This structure, denoted as mixture-of-experts (ME), is highly flexible in modeling the conditional mean function, as shown by Jiang and Tanner. We present a formal treatment of the conditions that guarantee the asymptotic normality of the maximum likelihood estimator (MLE), under both stationarity and nonstationarity, and under both correct model specification and model misspecification. The performance of common model selection criteria in selecting the number of experts is explored via Monte Carlo simulations. Finally, we present applications to simulated and real data sets to illustrate the ability of the proposed structure to model not only the conditional mean but also the whole conditional density.
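As a concrete illustration of the structure described in the abstract, the following is a minimal simulation sketch. It is not taken from the paper: the two-expert AR(1) specification and all parameter values are illustrative assumptions. Each expert is a Gaussian AR(1) process, and a logistic gating function of the lagged observation plays the role of the latent multinomial variable that selects which expert generates the next observation.

```python
# Minimal sketch (illustrative, not the authors' code): simulate one path from
# a two-expert mixture-of-experts of AR(1) processes.  Each expert is a
# Gaussian AR(1); the gating (latent multinomial, here binary) probability
# depends on the same covariate as the experts, namely the lagged observation.
import numpy as np

rng = np.random.default_rng(0)

# Expert j: y_t = c_j + a_j * y_{t-1} + sigma_j * eps_t,  eps_t ~ N(0, 1)
experts = [
    {"c": 0.5,  "a": 0.8,  "sigma": 0.5},   # expert 1
    {"c": -0.5, "a": -0.3, "sigma": 1.0},   # expert 2
]

# Gating: P(expert 1 | y_{t-1}) = logistic(g0 + g1 * y_{t-1})
g0, g1 = 0.0, 1.5

def simulate(n, y0=0.0):
    y = np.empty(n)
    y_prev = y0
    for t in range(n):
        p1 = 1.0 / (1.0 + np.exp(-(g0 + g1 * y_prev)))   # gating probability
        expert = experts[0] if rng.uniform() < p1 else experts[1]
        y[t] = (expert["c"] + expert["a"] * y_prev
                + expert["sigma"] * rng.standard_normal())
        y_prev = y[t]
    return y

series = simulate(500)
```

Fitting such a model by maximum likelihood and choosing the number of experts with standard model selection criteria correspond to the estimation and model-selection questions the paper studies.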
