Published in: International Symposium on Information Theory and its Applications

Sparse Bayesian Hierarchical Mixture of Experts and Variational Inference



Abstract

The hierarchical mixture of experts (HME) is a tree-structured probabilistic model for regression and classification. The HME has considerable expressive capability; however, estimation of its parameters tends to overfit because of the model's complexity. To avoid this problem, regularization techniques are widely used; in particular, it is known that a sparse solution can be obtained by L1 regularization. From a Bayesian point of view, regularization is equivalent to assuming that the parameters follow prior distributions and finding the maximum a posteriori (MAP) estimator, and L1 regularization corresponds to assuming Laplace priors. However, the posterior distribution is difficult to compute when Laplace priors are assumed directly. In this paper, we instead assume that the parameters of the HME follow hierarchical prior distributions that are equivalent to Laplace distributions, in order to promote sparse solutions. We propose a Bayesian estimation algorithm based on the variational method. Finally, the proposed algorithm is evaluated by computer simulations.
