JMLR: Workshop and Conference Proceedings

Efficient Algorithm for Sparse Tensor-variate Gaussian Graphical Models via Gradient Descent


Abstract

We study the sparse tensor-variate Gaussian graphical model (STGGM), where each way of the tensor follows a multivariate normal distribution whose precision matrix has sparse structures. In order to estimate the precision matrices, we propose a sparsity constrained maximum likelihood estimator. However, due to the complex structure of the tensor-variate GGMs, the likelihood based estimator is non-convex, which poses great challenges for both computation and theoretical analysis. In order to address these challenges, we propose an efficient alternating gradient descent algorithm to solve this estimator, and prove that, under certain conditions on the initial estimator, our algorithm is guaranteed to linearly converge to the unknown precision matrices up to the optimal statistical error. Experiments on both synthetic data and real world brain imaging data corroborate our theory.
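The abstract only describes the method at a high level. The sketch below is an illustrative, simplified rendering (not the authors' implementation) of one plausible form of sparsity-constrained alternating gradient descent, specialized to the two-mode (matrix-variate) case: each gradient step on one precision matrix holds the other fixed, followed by hard thresholding to enforce the sparsity constraint. The function names, step size `eta`, sparsity levels `s_row`/`s_col`, and the thresholding rule are assumptions made for illustration.

```python
import numpy as np

def hard_threshold(M, s):
    """Keep the s largest-magnitude off-diagonal entries of M, zero the rest.

    The diagonal is always retained. This is one simple way to project onto a
    sparsity constraint; the paper's exact projection may differ.
    """
    p = M.shape[0]
    if s >= p * (p - 1):
        return M
    off = np.abs(M.copy())
    np.fill_diagonal(off, 0.0)
    thr = np.partition(off.ravel(), -s)[-s] if s > 0 else np.inf
    out = np.where(off >= thr, M, 0.0)
    np.fill_diagonal(out, np.diag(M))
    return out

def alt_gradient_descent(X, s_row, s_col, eta=0.05, n_iter=200):
    """Alternating gradient descent for a matrix-variate (two-mode) GGM sketch.

    X has shape (n, p, q). Psi (p x p) and Theta (q x q) are the row and column
    precision matrices. The negative log-likelihood, up to constants, is
        (1/n) sum_i tr(Psi X_i Theta X_i^T) - q logdet(Psi) - p logdet(Theta).
    Note the usual Kronecker scale ambiguity between Psi and Theta; a small
    step size is assumed to keep the iterates positive definite.
    """
    n, p, q = X.shape
    Psi = np.eye(p)    # initial row precision (the theory assumes a good initializer)
    Theta = np.eye(q)  # initial column precision
    for _ in range(n_iter):
        # Gradient w.r.t. Psi with Theta fixed: (1/n) sum_i X_i Theta X_i^T - q Psi^{-1}
        S_psi = np.einsum('nij,jk,nlk->il', X, Theta, X) / n
        Psi = hard_threshold(Psi - eta * (S_psi - q * np.linalg.inv(Psi)), s_row)
        # Gradient w.r.t. Theta with Psi fixed: (1/n) sum_i X_i^T Psi X_i - p Theta^{-1}
        S_theta = np.einsum('nji,jk,nkl->il', X, Psi, X) / n
        Theta = hard_threshold(Theta - eta * (S_theta - p * np.linalg.inv(Theta)), s_col)
    return Psi, Theta

if __name__ == "__main__":
    # Toy usage on synthetic data drawn from independent entries (for shape checking only).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10, 8))
    Psi_hat, Theta_hat = alt_gradient_descent(X, s_row=20, s_col=16)
    print(Psi_hat.shape, Theta_hat.shape)
```

The paper treats the general K-mode tensor case; extending the sketch would mean cycling the same gradient-plus-thresholding step over all K precision matrices, with the mode-k sample covariance computed from the mode-k matricization of the data.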
