Journal: IEEE Transactions on Neural Networks and Learning Systems

Efficient $l_{1}$-Norm-Based Low-Rank Matrix Approximations for Large-Scale Problems Using Alternating Rectified Gradient Method



Abstract

Low-rank matrix approximation plays an important role in computer vision and image processing. Most conventional low-rank matrix approximation methods are based on the $l_{2}$-norm (Frobenius norm), with principal component analysis (PCA) being the most popular among them. However, these methods can give a poor approximation for data contaminated by outliers (including missing data), because the $l_{2}$-norm exaggerates the negative effect of outliers. Recently, to overcome this problem, various methods based on the $l_{1}$-norm, such as robust PCA methods, have been proposed for low-rank matrix approximation. Despite their robustness, these methods require heavy computational effort and substantial memory for high-dimensional data, which is impractical for real-world problems. In this paper, we propose two efficient low-rank factorization methods based on the $l_{1}$-norm that find proper projection and coefficient matrices using the alternating rectified gradient method. The proposed methods are applied to a number of low-rank matrix approximation problems to demonstrate their efficiency and robustness. The experimental results show that, unlike other state-of-the-art methods, our proposals are efficient in both execution time and reconstruction performance.
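
The abstract only sketches the optimization problem. As a rough illustration of the general idea (not the paper's alternating rectified gradient method), the minimal Python sketch below factorizes a data matrix X into a projection matrix U and a coefficient matrix V by alternating plain subgradient steps on the elementwise $l_{1}$ residual. The function name `l1_low_rank`, the update rule, the step size, and the toy data are all illustrative assumptions.

```python
import numpy as np

def l1_low_rank(X, rank, n_iters=500, lr=1e-3, seed=0):
    """Sketch: approximate X (m x n) by U @ V with U (m x rank) and V (rank x n),
    reducing the elementwise l1 norm ||X - U V||_1 via alternating subgradient
    steps. Illustrative only; not the paper's alternating rectified gradient method."""
    m, n = X.shape
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((rank, n))
    for _ in range(n_iters):
        R = X - U @ V            # residual
        G = np.sign(R)           # subgradient of ||R||_1 with respect to R
        U += lr * G @ V.T        # descend on U with V fixed
        R = X - U @ V
        G = np.sign(R)
        V += lr * U.T @ G        # descend on V with U fixed
    return U, V

# Toy usage: rank-2 data corrupted by sparse, large-magnitude outliers.
m, n, r = 50, 40, 2
rng = np.random.default_rng(1)
X_clean = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
X_corrupt = X_clean.copy()
mask = rng.random(X_clean.shape) < 0.05
X_corrupt[mask] += 10 * rng.standard_normal(mask.sum())
U, V = l1_low_rank(X_corrupt, rank=r)
print("mean abs reconstruction error:", np.abs(X_clean - U @ V).mean())
```

Because the $l_{1}$ subgradient is bounded (it only carries the sign of each residual), the large outlier entries pull on U and V no harder than ordinary entries, which is the robustness property the abstract attributes to $l_{1}$-based factorization.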
