Journal: Tsinghua Science and Technology (《清华大学学报(英文版)》)

Convergence Analysis of Forgetting Gradient Algorithm by Using Martingale Hyperconvergence Theorem


Abstract

The stochastic gradient (SG) algorithm has a lower computational burden than least-squares algorithms, but it cannot track time-varying parameters and has a poor convergence rate. To improve the tracking performance of the SG algorithm, the forgetting gradient (FG) algorithm is presented, and its convergence is analyzed by using the martingale hyperconvergence theorem. The results show that: (1) for time-invariant deterministic systems, the parameter estimates given by the FG algorithm converge consistently to their true values; (2) for stochastic time-varying systems, the parameter tracking error is bounded; that is, the tracking error is small when both the parameter change rate and the observation noise are small.
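To make the idea concrete, below is a minimal numerical sketch of a forgetting gradient recursion of the kind the abstract describes, assuming the standard linear regression form y(t) = φ(t)ᵀθ + v(t) with a forgetting factor λ. The function name `forgetting_gradient`, the choice of λ, and the simulated drifting system are illustrative assumptions, not the paper's exact recursion or notation.

```python
import numpy as np

def forgetting_gradient(phi, y, lam=0.98, theta0=None, r0=1.0):
    """Forgetting gradient (FG) recursion for y(t) = phi(t)^T theta + v(t).

    With lam = 1 this reduces to the ordinary stochastic gradient (SG)
    algorithm; lam < 1 discounts old data so slowly time-varying
    parameters can be tracked.  (Sketch under the assumptions above.)
    """
    n = phi.shape[1]
    theta = np.zeros(n) if theta0 is None else theta0.astype(float)
    r = r0
    history = []
    for t in range(len(y)):
        p = phi[t]
        r = lam * r + p @ p                            # discounted gain normalizer
        theta = theta + (p / r) * (y[t] - p @ theta)   # gradient correction step
        history.append(theta.copy())
    return theta, np.array(history)

# Example: track a slowly drifting two-parameter system.
rng = np.random.default_rng(0)
T, theta_true = 2000, np.array([1.0, -0.5])
phi = rng.normal(size=(T, 2))
drift = 0.0005 * np.cumsum(rng.normal(size=(T, 2)), axis=0)   # slow parameter drift
y = np.einsum('ij,ij->i', phi, theta_true + drift) + 0.1 * rng.normal(size=T)

theta_fg, _ = forgetting_gradient(phi, y, lam=0.98)
theta_sg, _ = forgetting_gradient(phi, y, lam=1.0)   # plain SG for comparison
print("FG estimate:", theta_fg, " true (final):", theta_true + drift[-1])
print("SG estimate:", theta_sg)
```

Consistent with result (2) of the abstract, a smaller λ lets the estimate follow parameter changes more quickly but makes it more sensitive to observation noise, so the tracking error stays small only when both the drift rate and the noise level are small.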
