In this paper, we propose a version of the coordinate descent method for solving linear least-squares problems. At each iteration, the new estimate is obtained by successive projection and orthogonalization onto the subspace spanned by two greedily selected columns. We prove that the new method converges linearly to the unique solution of the least-squares problem when the coefficient matrix is overdetermined and of full column rank. Experimental results further confirm that our method converges faster than the well-known greedy coordinate descent (GCD) and two-step Gauss-Seidel (2SGS) methods, in particular when the coefficient matrix has highly coherent columns.
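To make the iteration concrete, the following is a minimal sketch of a greedy two-column coordinate descent step, assuming the simplest reading of the abstract: pick the two columns most correlated with the current residual, then solve the two-variable least-squares subproblem exactly (a projection onto their span). The function name and the selection/update details are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def greedy_two_column_cd(A, b, iters=500):
    """Illustrative sketch (not the paper's exact method): at each step,
    greedily select the two columns most correlated with the residual and
    project the residual onto their span (exact 2-variable LS update)."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x
    col_norms = np.linalg.norm(A, axis=0)
    for _ in range(iters):
        # Greedy selection: two largest normalized residual correlations.
        scores = np.abs(A.T @ r) / col_norms
        j, k = np.argsort(scores)[-2:]
        # Exact update on the selected 2-column subspace (projection step).
        S = A[:, [j, k]]
        delta, *_ = np.linalg.lstsq(S, r, rcond=None)
        x[[j, k]] += delta
        r -= S @ delta
    return x

# Usage on an overdetermined, full-column-rank system.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
b = rng.standard_normal(100)
x = greedy_two_column_cd(A, b)
```

Because each step solves the two-coordinate subproblem exactly, the residual norm is monotonically nonincreasing, and for a full-column-rank `A` the iterates approach the unique least-squares solution.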