Recursive least squares (RLS) estimation is used extensively in many signal processing and control applications. The least squares estimator w(t) can be found by solving a linear matrix system A(t)w(t) = d(t) at each adaptive time step t. In this paper, we consider block RLS computations. Our approach is to employ Galerkin projection methods to solve the linear systems. The method generates a Krylov subspace from a set of direction vectors obtained by solving one of the systems, and then projects the residuals of the other systems orthogonally onto this Krylov subspace to obtain their approximate solutions. The whole process is repeated until all the systems are solved. Both the exponential data weighting infinite memory method and the finite memory sliding data window method are used to formulate the equations. To speed up the convergence of the method, FFT-based preconditioners are also employed. Numerical results are reported to illustrate the effectiveness of the Galerkin projection method for RLS computations.
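The projection step described above can be illustrated with a minimal sketch. The following Python code is not the paper's implementation; it is a hedged illustration of the general idea, assuming a symmetric positive definite system matrix A (as arises from RLS normal equations): conjugate gradients is run on the first system while its direction vectors are collected into a basis P, and a second system with the same matrix is then solved approximately by a Galerkin projection, i.e., by requiring its residual to be orthogonal to span(P). The function names and the use of plain CG (with no preconditioner) are illustrative assumptions.

```python
import numpy as np

def cg_with_basis(A, d, tol=1e-10, max_iter=None):
    """Solve A w = d (A symmetric positive definite) by conjugate gradients,
    also returning the direction vectors spanning the generated Krylov subspace."""
    n = len(d)
    max_iter = max_iter or n
    w = np.zeros(n)
    r = d - A @ w          # initial residual
    p = r.copy()           # first search direction
    directions = []
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        directions.append(p.copy())
        w = w + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return w, np.column_stack(directions)

def galerkin_project(A, d, P):
    """Approximate the solution of A w = d by w = P y, choosing y so that
    the residual d - A w is orthogonal to the columns of P (Galerkin condition)."""
    y = np.linalg.solve(P.T @ A @ P, P.T @ d)
    return P @ y
```

A typical use: solve the first system with `cg_with_basis`, then call `galerkin_project` on each remaining right-hand side; any system whose projected residual is still too large would be solved afresh, regenerating the subspace, which mirrors the repeated process the abstract describes.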