IEEE Transactions on Signal and Information Processing over Networks

Communication-Efficient Decentralized Sparse Bayesian Learning of Joint Sparse Signals



Abstract

We consider the problem of decentralized estimation of multiple joint sparse vectors by a network of nodes from locally acquired noisy and underdetermined linear measurements, when the cost of communication between the nodes is at a premium. We propose an iterative, decentralized Bayesian algorithm called fusion-based distributed sparse Bayesian learning (FB-DSBL), in which the nodes collaborate by exchanging highly compressed messages to learn a common joint-sparsity-inducing signal prior. The learnt signal prior is subsequently used by each node to compute the maximum a posteriori probability estimate of its respective sparse vector. Since internode communication is expensive, the size of the messages exchanged between nodes is reduced substantially by exchanging only those local signal prior parameters associated with the nonzero support detected via multiple composite log-likelihood ratio tests. The average message size is empirically shown to be proportional to the information rate of the unknown vectors. The proposed sparse Bayesian learning (SBL)-based distributed algorithm allows nodes to exploit the underlying joint sparsity of the signals. In turn, this enables the nodes to recover sparse vectors with significantly fewer measurements than the standalone SBL algorithm requires. The proposed algorithm is interpreted as a degenerate case of a distributed consensus-based stochastic approximation algorithm for finding a fixed point of a function, and its generalized version with Robbins–Monro-type iterations is also developed. Using Monte Carlo simulations, we demonstrate that the proposed FB-DSBL has superior mean squared error and support recovery performance compared to existing decentralized algorithms with similar or higher communication complexity.
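The core single-node building block the abstract refers to — sparse Bayesian learning, where each coefficient gets its own prior variance and an EM loop drives the variances of non-support coefficients toward zero — can be sketched as follows. This is a minimal illustration of standalone SBL under a Gaussian model, not the paper's decentralized FB-DSBL; the function name, parameters, and problem sizes are ours.

```python
import numpy as np

def sbl_map_estimate(A, y, noise_var=1e-4, n_iters=50):
    """Standalone SBL via EM for y = A x + n with prior x_i ~ N(0, gamma_i).

    Returns the posterior mean (the MAP estimate under the learnt prior)
    and the learnt per-coefficient prior variances gamma.
    """
    m, n = A.shape
    gamma = np.ones(n)                      # initial prior variances
    for _ in range(n_iters):
        G = np.diag(gamma)
        Sigma_y = noise_var * np.eye(m) + A @ G @ A.T   # measurement covariance
        K = G @ A.T @ np.linalg.inv(Sigma_y)
        mu = K @ y                          # posterior mean of x
        # Diagonal of the posterior covariance: Gamma - K A Gamma
        post_var = gamma - np.sum(K * (A @ G).T, axis=1)
        gamma = mu**2 + post_var            # EM update of the hyperparameters
    return mu, gamma

# Usage: recover a 5-sparse vector in R^80 from 40 noisy measurements.
rng = np.random.default_rng(0)
m, n, k = 40, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true + 1e-3 * rng.standard_normal(m)
x_hat, gamma = sbl_map_estimate(A, y)
```

In FB-DSBL the nodes would additionally fuse their local `gamma`-type hyperparameters across the network, transmitting only the entries flagged as nonzero support by the composite log-likelihood ratio tests, which is what keeps the per-iteration message size proportional to the signals' information rate.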


