IEEE International Conference on Acoustics, Speech and Signal Processing

Distributed recursive least-squares with data-adaptive censoring



Abstract

The deluge of networked big data motivates the development of computation- and communication-efficient network information processing algorithms. In this paper, we propose two data-adaptive censoring strategies that significantly reduce the computation and communication costs of the distributed recursive least-squares (D-RLS) algorithm. By introducing a cost function that downweights observations with small innovations, we develop the first censoring strategy based on the alternating minimization algorithm and the stochastic Newton method; it saves computation whenever a datum is censored. The second censoring strategy reduces the computation and communication costs further by preventing a node from updating and transmitting its local estimate to its neighbors when its current innovation is below a threshold. For both strategies, a simple criterion for selecting the innovation threshold is given so as to reach a target data-reduction ratio. The proposed censored D-RLS algorithms are guaranteed to converge to the optimal argument in the mean-square-deviation sense. Numerical experiments validate the effectiveness of the proposed algorithms.
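To make the second strategy concrete, the sketch below shows a single node censoring a datum when the magnitude of its innovation falls below a threshold, performing a standard RLS update and (conceptually) broadcasting its estimate only otherwise. This is a minimal single-node illustration under assumed settings, not the paper's exact D-RLS recursion; the function name `censored_rls_step`, the forgetting factor, the threshold value, and the toy data are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def censored_rls_step(P, x_hat, h, y, tau, lam=0.98):
    """One data-adaptively censored RLS update at a single node (illustrative sketch).

    P      : current inverse-correlation matrix estimate
    x_hat  : current local parameter estimate
    h      : regressor vector of the new observation
    y      : new scalar observation
    tau    : censoring threshold on the innovation magnitude (assumed)
    lam    : forgetting factor (assumed)
    Returns (P, x_hat, transmitted), where `transmitted` indicates whether the
    node would update and broadcast its local estimate to its neighbors.
    """
    innovation = y - h @ x_hat
    if abs(innovation) < tau:
        # Small innovation: censor the datum -- skip the update and the transmission.
        return P, x_hat, False
    # Informative datum: perform a standard RLS update.
    Ph = P @ h
    k = Ph / (lam + h @ Ph)              # gain vector
    x_hat = x_hat + k * innovation
    P = (P - np.outer(k, Ph)) / lam
    return P, x_hat, True

# Toy run: estimate a 3-dimensional parameter from noisy linear measurements.
d, T, tau = 3, 200, 0.5
x_true = rng.standard_normal(d)
P, x_hat = 1e3 * np.eye(d), np.zeros(d)
sent = 0
for _ in range(T):
    h = rng.standard_normal(d)
    y = h @ x_true + 0.1 * rng.standard_normal()
    P, x_hat, tx = censored_rls_step(P, x_hat, h, y, tau)
    sent += tx
print(f"transmitted {sent}/{T} updates, error = {np.linalg.norm(x_hat - x_true):.3f}")

In this sketch, raising tau censors more data (fewer updates and transmissions) at the cost of slower convergence; the paper's criterion for choosing the threshold to hit a target data-reduction ratio is not reproduced here.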