Conference: Advanced Topics in Optoelectronics, Microelectronics, and Nanotechnologies

On the performance of the variable-regularized recursive least-squares algorithms



Abstract

The recursive least-squares (RLS) is a very popular adaptive algorithm, which is widely used in many system identification problems. The performance of the algorithm is controlled by two important parameters, i.e., the forgetting factor and the regularization parameter. The forgetting factor controls the "memory" of the algorithm and its value leads to a compromise between low misadjustment and fast convergence. The regularization term is required in most adaptive algorithms and its role becomes very critical in the presence of additive noise. In this paper, we present the regularized RLS algorithm and we develop a method to find its regularization parameter, which is related to the signal-to-noise ratio (SNR). Also, using a proper estimation of the SNR, we present a variable-regularized RLS (VR-RLS) algorithm.
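The abstract's ingredients (forgetting factor, regularization of the inverse correlation matrix, and an SNR-dependent regularization parameter) can be sketched as follows. This is a minimal illustration of a regularized RLS identifier, not the paper's exact VR-RLS algorithm: the function name `rls_identify` and the specific rule `delta = L * var(x) / SNR` (regularization growing as SNR falls) are illustrative assumptions, since the paper's formula is not reproduced in the abstract.

```python
import numpy as np

def rls_identify(x, d, L, lam=0.999, delta=1.0):
    """Identify an L-tap FIR system with regularized RLS.

    x:     input signal
    d:     desired signal (noisy system output)
    lam:   forgetting factor (memory vs. misadjustment trade-off)
    delta: regularization parameter; P is initialized as I / delta
    """
    w = np.zeros(L)              # filter estimate
    P = np.eye(L) / delta        # regularized inverse correlation matrix
    for n in range(L - 1, len(x)):
        u = x[n - L + 1:n + 1][::-1]      # regressor, most recent sample first
        Pu = P @ u
        k = Pu / (lam + u @ Pu)           # gain vector
        e = d[n] - w @ u                  # a priori error
        w = w + k * e
        P = (P - np.outer(k, Pu)) / lam   # inverse-correlation update
    return w

if __name__ == "__main__":
    # Toy identification run: white input through a known 4-tap system,
    # output observed at roughly 20 dB SNR (SNR assumed known here for
    # the illustrative regularization rule).
    rng = np.random.default_rng(0)
    h = np.array([0.5, -0.3, 0.2, 0.1])
    x = rng.standard_normal(5000)
    clean = np.convolve(x, h)[:len(x)]
    snr_lin = 100.0                        # 20 dB in linear scale
    d = clean + rng.standard_normal(len(x)) * np.sqrt(np.var(clean) / snr_lin)
    delta = len(h) * np.var(x) / snr_lin   # assumed SNR-dependent rule
    w = rls_identify(x, d, len(h), lam=0.999, delta=delta)
    print("estimate:", np.round(w, 3))
```

In a variable-regularized scheme, `delta` would be recomputed from a running SNR estimate rather than fixed in advance; the fixed value above only shows where the SNR enters.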
