IEEE Statistical Signal Processing Workshop

Globally Convergent Algorithms for Learning Multivariate Generalized Gaussian Distributions



Abstract

The multivariate generalized Gaussian distribution (MGGD) has been used intensively in various data analytics fields. Owing to its flexibility in modeling different distributions, developing efficient methods to learn the model parameters has attracted considerable attention. Existing algorithms, including the popular fixed-point algorithms, focus on learning the shape parameter and scatter matrix, but convergence is established only when the shape parameter is taken as given. When the shape parameter is estimated jointly, the convergence properties of the existing alternating algorithms remain unknown. In this paper, globally convergent algorithms based on the block majorization-minimization method are proposed to jointly learn all the model parameters in the maximum likelihood estimation setting. The negative log-likelihood function with respect to the shape parameter is proved to be strictly convex, which to the best of our knowledge is the first result of this kind in the literature. The superior performance of the proposed algorithms is validated numerically on synthetic data, with comparisons to existing methods.
