IEEE Signal Processing Letters

A generalized normalized gradient descent algorithm



Abstract

A generalized normalized gradient descent (GNGD) algorithm for linear finite-impulse response (FIR) adaptive filters is introduced. GNGD extends the normalized least mean square (NLMS) algorithm by adding a gradient-adaptive term to the denominator of the NLMS learning rate. In this way, GNGD adapts its learning rate according to the dynamics of the input signal, with the additional adaptive term compensating for the simplifications made in the derivation of NLMS. The performance of GNGD is bounded from below by that of NLMS, and GNGD converges in environments where NLMS diverges. GNGD is also shown to be robust to significant variations in the initial values of its parameters. Simulations in the prediction setting support the analysis.
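The abstract does not give the update equations, so the following Python sketch is only an illustration of a GNGD-style one-step-ahead predictor built on the stated idea: an NLMS weight update whose regularization term in the learning-rate denominator is itself adapted by a gradient step. The function name gngd_predict, the parameter values, and the specific epsilon update below (a commonly cited GNGD form) are assumptions, not taken from this page.

```python
import numpy as np

def gngd_predict(x, filter_len=4, mu=0.5, rho=0.15, eps_init=0.1):
    """One-step-ahead prediction with a GNGD-style adaptive FIR filter (illustrative sketch).

    Learning rate at step k: mu / (||u(k)||^2 + eps(k)), where eps(k) is
    gradient-adapted. The eps update used here is an assumed, commonly cited form.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    w = np.zeros(filter_len)                  # FIR filter weights
    eps = eps_init                            # adaptive regularization term
    e = np.zeros(n)                           # prediction errors
    u_prev = np.zeros(filter_len)             # previous regressor
    e_prev, denom_prev = 0.0, 1.0             # previous error and denominator

    for k in range(filter_len, n):
        u = x[k - filter_len:k][::-1]         # regressor, most recent sample first
        y = w @ u                             # predicted sample
        e[k] = x[k] - y                       # prediction error
        denom = u @ u + eps
        w += (mu / denom) * e[k] * u          # NLMS-style weight update
        # gradient-adaptive update of eps (assumed GNGD form)
        eps -= rho * mu * e[k] * e_prev * (u @ u_prev) / (denom_prev ** 2)
        u_prev, e_prev, denom_prev = u, e[k], denom
    return e, w
```

As a usage sketch, e, w = gngd_predict(signal) returns the prediction-error trajectory and the final filter weights; the error sequence is what a prediction-setting simulation like the one mentioned in the abstract would typically examine.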
