Neurocomputing

Low-rank tensor completion via combined non-local self-similarity and low-rank regularization



Abstract

Global low-rank methods have achieved great success in tensor completion. However, these methods neglect the abundant non-local self-similarities that exist in a wide range of multi-dimensional imaging data. To integrate the global and non-local properties of the underlying tensor, we propose a novel low-rank tensor completion model via combined non-local self-similarity and low-rank regularization, named NLS-LR. We adopt parallel low-rank matrix factorization to guarantee global low-rankness, while plugging in non-local denoisers to promote non-local self-similarity instead of tailoring explicit regularizers. To solve the proposed model, we develop an efficient algorithm based on block successive upper-bound minimization (BSUM). Numerical experiments demonstrate that the proposed method outperforms many state-of-the-art tensor completion methods in terms of quality metrics and visual effects. (C) 2019 Elsevier B.V. All rights reserved.
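The abstract suggests a natural optimization form. The following is one plausible formulation consistent with that description, not the paper's exact model; all symbols here are assumptions. Parallel factorization of each mode-n unfolding enforces global low-rankness, while an implicit regularizer, realized by the plug-in non-local denoiser, captures self-similarity:

```latex
\begin{equation*}
\min_{\mathcal{X},\,\{A_n, B_n\}_{n=1}^{N}}
  \sum_{n=1}^{N} \frac{\alpha_n}{2}
  \left\| \mathbf{X}_{(n)} - A_n B_n \right\|_F^2
  + \lambda \, \Phi_{\mathrm{NLS}}(\mathcal{X})
\quad \text{s.t.} \quad
\mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{T}),
\end{equation*}
```

Here $\mathcal{T}$ is the observed tensor on the index set $\Omega$, $\mathbf{X}_{(n)}$ is the mode-$n$ unfolding of $\mathcal{X}$, and $\Phi_{\mathrm{NLS}}$ is the implicit non-local self-similarity term handled by the denoiser. A BSUM scheme would cycle through the blocks $A_n$, $B_n$, $\mathcal{X}$, each time minimizing a local upper bound of this objective.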
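For concreteness, below is a minimal Python sketch of this style of method. It is not the authors' algorithm: the parallel factorization step is approximated by a truncated SVD of each unfolding rather than explicit A_n B_n factor updates, the `denoiser` argument stands in for any non-local denoiser (e.g. a BM3D wrapper), and every name and parameter here is hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def fold(M, mode, shape):
    # Inverse of `unfold`: rebuild the tensor from its mode-n unfolding.
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)


def nls_lr_sketch(obs, mask, ranks, denoiser, n_iter=50, lam=0.1):
    """Hypothetical NLS-LR-style completion loop (illustrative only).

    obs      -- observed tensor, zeros at missing entries
    mask     -- boolean tensor, True where entries are observed
    ranks    -- assumed rank for each mode's low-rank approximation
    denoiser -- callable standing in for a non-local denoiser
    lam      -- assumed weight balancing the denoised estimate
    """
    X = obs.astype(float).copy()
    N = X.ndim
    for _ in range(n_iter):
        # Global low-rankness: average rank-truncated reconstructions of
        # every mode unfolding (the paper factorizes X_(n) into A_n B_n;
        # a truncated SVD is used here for brevity).
        est = np.zeros_like(X)
        for n in range(N):
            U, s, Vt = np.linalg.svd(unfold(X, n), full_matrices=False)
            r = ranks[n]
            est += fold((U[:, :r] * s[:r]) @ Vt[:r, :], n, X.shape) / N
        # Non-local self-similarity: plug-and-play denoising step.
        X = (est + lam * denoiser(est)) / (1.0 + lam)
        # Data fidelity: keep observed entries fixed.
        X[mask] = obs[mask]
    return X


if __name__ == "__main__":
    # Toy usage on a synthetic rank-4 tensor; gaussian_filter is only a
    # placeholder, since a genuinely non-local denoiser is what the
    # abstract's plug-in step assumes.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 4))
    B = rng.standard_normal((20, 4))
    C = rng.standard_normal((5, 4))
    truth = np.einsum("ir,jr,kr->ijk", A, B, C)
    mask = rng.random(truth.shape) < 0.5
    obs = np.where(mask, truth, 0.0)
    rec = nls_lr_sketch(obs, mask, ranks=(4, 4, 4),
                        denoiser=lambda t: gaussian_filter(t, sigma=1.0))
```

The data-fidelity projection at the end of each iteration plays the role of the equality constraint above; in a faithful BSUM implementation, each block update would instead minimize a surrogate upper bound of the full objective.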


