Applied Mathematics and Computation

Low-rank tensor train for tensor robust principal component analysis



Abstract

Recently, the tensor train (TT) rank, defined by a well-balanced matricization scheme, has been shown to have a powerful capacity to capture the hidden correlations among the different modes of a tensor, leading to great success in the tensor completion problem. Most high-dimensional data in the real world are likely to be grossly corrupted by sparse noise. In this paper, based on the tensor train rank, we consider a new model for tensor robust principal component analysis that aims to recover a low-rank tensor corrupted by sparse noise. An alternating direction method of multipliers (ADMM) algorithm is developed to solve the proposed model. A tensor augmentation tool called ket augmentation is used to convert lower-order tensors into higher-order tensors, enhancing the performance of our method. Experiments on simulated data show the superiority of the proposed method in terms of PSNR and SSIM values. Moreover, experiments on real rain streak removal and real stripe noise removal further illustrate the effectiveness of the proposed method. (C) 2019 Elsevier Inc. All rights reserved.
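As the abstract describes, ket augmentation (KA) is a preprocessing step that reshapes a low-order array (e.g. an image) into a higher-order tensor, so that the TT rank can exploit more balanced unfoldings. Below is a minimal NumPy sketch of one common KA variant for 2^n × 2^n images, in which row and column bits are paired coarse-to-fine so that each size-4 mode addresses one 2×2 block level; the function name and this particular bit ordering are illustrative assumptions, not necessarily the paper's exact construction.

```python
import numpy as np

def ket_augmentation(img):
    """Map a 2^n x 2^n image to an order-n tensor with each mode of size 4.

    Illustrative sketch of one common KA variant (coarse-to-fine
    pairing of row and column bits), not the paper's exact scheme.
    """
    n = int(np.log2(img.shape[0]))
    assert img.shape == (2 ** n, 2 ** n), "expects a square 2^n x 2^n input"
    # split rows and columns into n binary digits each: (r1..rn, c1..cn)
    t = img.reshape((2,) * n + (2,) * n)
    # interleave row and column bits: (r1, c1, r2, c2, ..., rn, cn)
    t = t.transpose([k for i in range(n) for k in (i, n + i)])
    # merge each (ri, ci) pair into one mode of size 4
    return t.reshape((4,) * n)

# a 4x4 image becomes an order-2 tensor of shape (4, 4)
img = np.arange(16, dtype=float).reshape(4, 4)
print(ket_augmentation(img).shape)  # (4, 4)
```

The reshape/transpose pipeline is lossless and invertible, so the recovered low-rank tensor can be mapped back to the original image layout by reversing the same steps.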


