IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing

A semismooth Newton method for adaptive distributed sparse linear regression


Abstract

This work studies the application of a semismooth Newton (SSN) method to accelerate the convergence of distributed quadratic programming LASSO (DQP-LASSO), a consensus-based distributed sparse linear regression algorithm. DQP-LASSO uses the alternating direction method of multipliers (ADMM) to reduce a global LASSO problem to a series of local (per-agent) LASSO optimizations, whose outcomes are then appropriately combined. The SSN algorithm enjoys superlinear convergence and thus permits these local optimizations to be carried out more efficiently. Yet in some cases SSN can experience convergence issues. Here it is shown that the regularization inherent to ADMM is sufficient to stabilize the SSN algorithm, ensuring stable convergence of the whole scheme. Additionally, the structure of the SSN algorithm permits an adaptive implementation of distributed sparse regression. This allows estimation of time-varying sparse vectors and reduces storage requirements when processing streams of data.
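For orientation, the classical single-machine ADMM splitting for LASSO (the Boyd et al. formulation, not the paper's distributed DQP-LASSO and not its SSN inner solver) can be sketched as below; the `rho * I` term that ADMM adds to the x-update linear system is a simple illustration of the "ADMM-inherent regularization" the abstract refers to. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via standard ADMM.

    Illustrative sketch only: the paper instead solves per-agent LASSO
    subproblems with a semismooth Newton method; here the z-update is the
    plain proximal (soft-thresholding) step.  Note the rho*np.eye(n) term:
    ADMM regularizes the normal equations of the x-update, which is the
    kind of built-in regularization the abstract exploits to stabilize SSN.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Factor (A^T A + rho I) once; it is reused every iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        z = soft_threshold(x + u, lam / rho)               # z-update
        u += x - z                                         # dual ascent
    return z
```

In the distributed setting described by the abstract, each agent would solve a local subproblem of this shape and the consensus constraint of ADMM combines the per-agent iterates; replacing the local solve with SSN is what yields the superlinear local convergence.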
