
Zero attracting recursive least squares algorithms



Abstract

The l1-norm sparsity constraint is a widely used technique for constructing sparse models. In this contribution, two zero-attracting recursive least squares algorithms, referred to as ZA-RLS-I and ZA-RLS-II, are derived by employing an l1-norm constraint on the parameter vector to promote model sparsity. In order to achieve a closed-form solution, the l1-norm of the parameter vector is approximated by an adaptively weighted l2-norm, in which the weighting factors are set as the inverses of the absolute values of the associated parameter estimates, which are readily available in the adaptive learning environment. ZA-RLS-II is computationally more efficient than ZA-RLS-I, as it exploits known results from linear algebra as well as the sparsity of the system. The proposed algorithms are proven to converge, and adaptive sparse channel estimation is used to demonstrate the effectiveness of the proposed approach.
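The abstract does not give the exact recursions of ZA-RLS-I or ZA-RLS-II, but the core idea it describes — approximating the l1 penalty by an adaptively weighted l2 term whose weights are the inverse absolute values of the previous parameter estimates, so that each step admits a closed-form solve — can be sketched as follows. All function and parameter names here (`za_rls_sketch`, `gamma`, `eps`, `delta`) are hypothetical, and the update shown is a straightforward reweighted-l2 regularized least squares, not the paper's optimized algorithm:

```python
import numpy as np

def za_rls_sketch(U, d, gamma=1e-2, lam=0.99, eps=1e-3, delta=1.0):
    """Hypothetical zero-attracting RLS sketch (not the paper's exact recursion).

    The l1 penalty gamma * ||w||_1 is approximated by the adaptively
    weighted l2 term gamma * w^T D w, with D = diag(1 / (|w_i| + eps))
    built from the previous estimate, so each step is a closed-form
    regularized least-squares solve.
    """
    n_taps = U.shape[1]
    w = np.zeros(n_taps)
    R = delta * np.eye(n_taps)   # exponentially weighted autocorrelation matrix
    r = np.zeros(n_taps)         # exponentially weighted cross-correlation vector
    for u_n, d_n in zip(U, d):
        R = lam * R + np.outer(u_n, u_n)
        r = lam * r + d_n * u_n
        # Reweighting: taps near zero receive a large penalty (zero attraction).
        D = np.diag(1.0 / (np.abs(w) + eps))
        # Closed-form solve of the weighted-l2 (approximate l1) regularized LS problem.
        w = np.linalg.solve(R + gamma * D, r)
    return w

# Toy sparse channel estimation: identify a channel with 2 active taps out of 8.
rng = np.random.default_rng(0)
h = np.zeros(8)
h[1], h[4] = 1.0, -0.5                            # sparse "channel" to identify
U = rng.standard_normal((500, 8))                 # regressor rows
d = U @ h + 0.01 * rng.standard_normal(500)       # noisy desired signal
w_hat = za_rls_sketch(U, d)
```

Note that `eps` plays a double role: it keeps the reweighting finite at exactly-zero taps and prevents the estimate from locking at the all-zero initialization, since with `w = 0` every weight would otherwise blow up.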


