IEEE Transactions on Signal Processing

Greedy Sparsity-Promoting Algorithms for Distributed Learning


Abstract

This paper focuses on the development of novel greedy techniques for distributed learning under sparsity constraints. Greedy techniques have been widely used in centralized systems because of their low computational requirements combined with relatively good performance in estimating sparse parameter vectors/signals. The paper reports two new algorithms in the context of sparsity-aware learning. In both cases, the goal is first to identify the support set of the unknown signal and then to estimate the nonzero values restricted to the active support set. First, an iterative greedy multistep procedure is developed, based on a neighborhood cooperation strategy, using batch processing on the observed data. Next, an extension of the algorithm to the online setting, based on the diffusion LMS rationale for adaptivity, is derived. A theoretical analysis of the algorithms is provided, where it is shown that the batch algorithm converges to the unknown vector if a Restricted Isometry Property (RIP) holds. Moreover, the online version converges in the mean to the solution vector under some general assumptions. Finally, the proposed schemes are tested against recently developed sparsity-promoting algorithms, and their enhanced performance is verified via simulation examples.
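The abstract's two-stage idea (greedily select a support set, then estimate the nonzero values on it) and the diffusion-LMS style of neighborhood cooperation can be illustrated with a minimal sketch. This is not the paper's algorithm: it combines a standard adapt-then-combine diffusion LMS step with simple hard thresholding for support selection, on a hypothetical ring network, purely to show the structure the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_threshold(w, s):
    """Greedy support selection: keep the s largest-magnitude entries."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-s:]
    out[idx] = w[idx]
    return out

# Hypothetical setup: N nodes cooperatively estimate one s-sparse vector.
N, p, s = 5, 20, 3
w_true = np.zeros(p)
w_true[rng.choice(p, size=s, replace=False)] = rng.standard_normal(s)

# Ring topology: each node averages with its two neighbors (uniform weights).
neighbors = [[(k - 1) % N, k, (k + 1) % N] for k in range(N)]

w = [np.zeros(p) for _ in range(N)]
mu = 0.05  # LMS step size
for t in range(3000):
    # Adapt step: each node runs one LMS update on its local observation.
    psi = []
    for k in range(N):
        x = rng.standard_normal(p)                      # local regressor
        d = x @ w_true + 0.01 * rng.standard_normal()   # noisy measurement
        psi.append(w[k] + mu * (d - x @ w[k]) * x)
    # Combine step: average neighbor estimates, then sparsify greedily.
    w = [hard_threshold(np.mean([psi[j] for j in neighbors[k]], axis=0), s)
         for k in range(N)]

err = np.linalg.norm(w[0] - w_true)
print(err)  # residual error at node 0; small after convergence
```

The thresholding after the combine step plays the role of the support-identification stage, while the LMS adaptation refines the values restricted to that support; the papers' actual schemes use a more refined multistep procedure and carry convergence guarantees under RIP-type conditions.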
