IEEE Data Science Workshop

SAVE - SPACE ALTERNATING VARIATIONAL ESTIMATION FOR SPARSE BAYESIAN LEARNING


Abstract

In this paper, we address the fundamental problem of sparse signal recovery in a Bayesian framework. The computational complexity associated with Sparse Bayesian Learning (SBL) renders it infeasible even for moderately large problem sizes. To address this issue, we propose a fast version of SBL using Variational Bayesian (VB) inference. VB allows one to obtain analytical approximations to the posterior distributions of interest even when exact inference of these distributions is intractable. We propose a novel fast algorithm called space alternating variational estimation (SAVE), which is a version of VB(-SBL) pushed to the scalar level. As with SAGE (space-alternating generalized expectation maximization) relative to EM, the component-wise approach of SAVE relative to SBL makes it less likely to get stuck in bad local optima, and its inherent damping (a more cautious progression) typically leads to faster convergence of the non-convex optimization process. Simulation results show that the proposed algorithm has a faster convergence rate and achieves lower MSE than other state-of-the-art fast SBL methods.
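
The abstract describes SAVE only at a high level, so as a concrete illustration, the sketch below implements generic scalar-level (component-wise) mean-field VB updates for the standard SBL model y = Ax + n, with x_i ~ N(0, 1/α_i) and Gamma hyperpriors on the precisions α_i and the noise precision γ. This is a hypothetical reconstruction of the kind of coordinate-wise updates the abstract refers to, not the paper's exact SAVE recursions (which include their specific damping behavior); all names and default hyperparameters are illustrative.

```python
import numpy as np

def scalar_vb_sbl(A, y, a=1e-6, b=1e-6, c=1e-6, d=1e-6, n_iter=100):
    """Component-wise mean-field VB for y = A x + n (illustrative sketch).

    Assumed priors (standard SBL): x_i ~ N(0, 1/alpha_i),
    alpha_i ~ Gamma(a, b), n ~ N(0, (1/gamma) I), gamma ~ Gamma(c, d).
    q(x) factorizes over scalars: q(x_i) = N(mu_i, sig2_i).
    """
    M, N = A.shape
    col_norm2 = np.sum(A ** 2, axis=0)        # ||a_i||^2 per column
    mu = np.zeros(N)                          # E[x_i]
    sig2 = np.ones(N)                         # Var[x_i]
    e_alpha = np.ones(N)                      # E[alpha_i]
    e_gamma = 1.0                             # E[gamma]
    r = y - A @ mu                            # running residual y - A mu
    for _ in range(n_iter):
        for i in range(N):                    # scalar-level sweep
            r += A[:, i] * mu[i]              # exclude component i
            sig2[i] = 1.0 / (e_gamma * col_norm2[i] + e_alpha[i])
            mu[i] = sig2[i] * e_gamma * (A[:, i] @ r)
            r -= A[:, i] * mu[i]              # restore with updated mean
        # Closed-form Gamma posterior means for the precisions
        e_alpha = (a + 0.5) / (b + 0.5 * (mu ** 2 + sig2))
        e_resid2 = r @ r + np.sum(col_norm2 * sig2)   # E||y - A x||^2
        e_gamma = (c + 0.5 * M) / (d + 0.5 * e_resid2)
    return mu, sig2, e_alpha, e_gamma

if __name__ == "__main__":
    # Toy usage: 10-sparse signal, 100 x 200 Gaussian dictionary.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 200))
    x_true = np.zeros(200)
    x_true[rng.choice(200, 10, replace=False)] = rng.standard_normal(10)
    y = A @ x_true + 0.01 * rng.standard_normal(100)
    mu, _, e_alpha, _ = scalar_vb_sbl(A, y)
    print("recovery MSE:", np.mean((mu - x_true) ** 2))
```

Because each q(x_i) update costs only O(M), a full sweep costs O(MN) and avoids the large matrix inversions of standard SBL, which is consistent with the complexity reduction the abstract claims.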