We introduce a randomized algorithm for overdetermined linear least-squares regression. Given an arbitrary full-rank m × n matrix A with m ≥ n, any m × 1 vector b, and any positive real number ε, the procedure computes an n × 1 vector x such that x minimizes the Euclidean norm ‖Ax − b‖ to relative precision ε. The algorithm typically requires 𝒪((log(n) + log(1/ε))mn + n³) floating-point operations. This cost is less than the 𝒪(mn²) required by classical schemes based on QR decompositions or bidiagonalization. We present several numerical examples illustrating the performance of the algorithm.
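The flavor of such randomized least-squares solvers can be illustrated with a minimal sketch-and-solve routine. The sketch below uses a dense Gaussian sketching matrix for clarity; this does not attain the stated 𝒪((log(n) + log(1/ε))mn + n³) cost, which requires a structured (fast-transform-based) random projection as in the paper. The function name and the sketch size heuristic are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def sketched_lstsq(A, b, sketch_rows=None, rng=None):
    """Approximately solve min_x ||Ax - b|| via sketch-and-solve.

    Illustrative only: a Gaussian sketch is used here, whereas the
    paper's algorithm applies a structured random transform to reach
    its lower operation count.
    """
    m, n = A.shape
    rng = np.random.default_rng() if rng is None else rng
    # Heuristic sketch size: a modest multiple of n (an assumption).
    s = 4 * n if sketch_rows is None else sketch_rows
    # Random sketching matrix compressing m rows down to s rows.
    S = rng.standard_normal((s, m)) / np.sqrt(s)
    # Solve the much smaller s x n least-squares problem.
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x
```

The compressed problem has only s ≈ 𝒪(n) rows, so the final dense solve costs 𝒪(n³) regardless of m; the quality of the answer depends on the sketch preserving the geometry of the column space of [A | b].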