
Regularized Weighted Low Rank Approximation



Abstract

The classical low rank approximation problem is to find a rank k matrix UV (where U has k columns and V has k rows) that minimizes the Frobenius norm of A - UV. Although this problem can be solved efficiently, we study an NP-hard variant of this problem that involves weights and regularization. A previous paper of [Razenshteyn et al. '16] derived a polynomial time algorithm for weighted low rank approximation with constant rank. We derive provably sharper guarantees for the regularized version by obtaining parameterized complexity bounds in terms of the statistical dimension rather than the rank, allowing for a rank-independent runtime that can be significantly faster. Our improvement comes from applying sharper matrix concentration bounds, using a novel conditioning technique, and proving structural theorems for regularized low rank problems.
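To make the objects in the abstract concrete, the sketch below contrasts the classical problem with the weighted, regularized variant. The exact objective min ||W ∘ (A - UV)||_F^2 + lam (||U||_F^2 + ||V||_F^2), the weight matrix W, the parameter lam, and the helper names (classical_low_rank, weighted_regularized_als, statistical_dimension) are illustrative assumptions based on the standard formulation in this line of work; the alternating least squares loop is a common baseline heuristic, not the algorithm analyzed in the paper.

import numpy as np

def classical_low_rank(A, k):
    # Rank-k minimizer of ||A - UV||_F via truncated SVD (Eckart-Young).
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] * s[:k], Vt[:k, :]

def statistical_dimension(A, lam):
    # Standard ridge effective dimension: sum_i sigma_i^2 / (sigma_i^2 + lam).
    # Assumed here to correspond to the "statistical dimension" in the abstract.
    s = np.linalg.svd(A, compute_uv=False)
    return float(np.sum(s**2 / (s**2 + lam)))

def weighted_regularized_als(A, W, k, lam=1.0, iters=50, seed=0):
    # Heuristic alternating least squares for
    #   min_{U,V} ||W * (A - U V)||_F^2 + lam * (||U||_F^2 + ||V||_F^2),
    # where * is entrywise. This is only a baseline local-search heuristic,
    # not the parameterized algorithm developed in the paper.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((k, d))
    for _ in range(iters):
        for i in range(n):                       # each row of U: a weighted ridge solve
            w = W[i]
            G = (V * w) @ V.T + lam * np.eye(k)
            U[i] = np.linalg.solve(G, (V * w) @ A[i])
        for j in range(d):                       # each column of V, symmetrically
            w = W[:, j]
            G = U.T @ (U * w[:, None]) + lam * np.eye(k)
            V[:, j] = np.linalg.solve(G, U.T @ (w * A[:, j]))
    return U, V

With all weights equal to 1 and lam = 0, the weighted objective reduces to the classical Frobenius-norm problem solved exactly by the truncated SVD; with general weights the problem becomes NP-hard, which is why only local heuristics like the ALS loop above, or parameterized algorithms such as the one in this paper, are available.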

