International Journal of Computer Mathematics

An inexact continuation accelerated proximal gradient algorithm for low n-rank tensor recovery



Abstract

The low n-rank tensor recovery problem is an interesting extension of compressed sensing. It consists of finding a tensor of minimum n-rank subject to linear equality constraints and arises in many areas such as data mining, machine learning and computer vision. In this paper, an operator splitting technique and a convex relaxation technique are used to transform the low n-rank tensor recovery problem into a convex, unconstrained optimization problem whose objective function is the sum of a convex smooth function with a Lipschitz continuous gradient and a convex function on a set of matrices. An accelerated proximal gradient algorithm is then proposed to solve this unconstrained nonsmooth convex optimization problem, and several computational techniques are used to improve it. Finally, preliminary numerical results demonstrate the potential value and applications of the tensor recovery model as well as the efficiency of the proposed algorithm.
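The accelerated proximal gradient scheme described in the abstract alternates a gradient step on the smooth least-squares term with the proximal operator of the nonsmooth term, which for a nuclear-norm penalty is singular value thresholding. The sketch below is a minimal illustration of that idea in the simplest single-unfolding (matrix) case; the function names, the regularization parameter lam, and the random measurement operator A are illustrative assumptions, and the paper's operator splitting over all mode-n unfoldings, continuation strategy, and inexact-solve rules are not reproduced here.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def apg_nuclear(A, b, shape, lam=0.1, max_iter=500, tol=1e-6):
    """FISTA-style accelerated proximal gradient for
    min_X 0.5 * ||A vec(X) - b||^2 + lam * ||X||_*   (matrix case)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    X = np.zeros(shape)
    Y = X.copy()
    t = 1.0
    for _ in range(max_iter):
        grad = (A.T @ (A @ Y.ravel() - b)).reshape(shape)   # gradient of the smooth term
        X_new = svt(Y - grad / L, lam / L)                  # proximal (thresholding) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0    # momentum update
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)       # extrapolated point
        if np.linalg.norm(X_new - X) <= tol * max(1.0, np.linalg.norm(X_new)):
            return X_new
        X, t = X_new, t_new
    return X

if __name__ == "__main__":
    # Synthetic low-rank recovery problem (hypothetical data, for illustration only).
    rng = np.random.default_rng(0)
    m, n, r = 20, 20, 2
    X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
    A = rng.standard_normal((150, m * n))
    b = A @ X_true.ravel()
    X_hat = apg_nuclear(A, b, (m, n), lam=0.1)
    print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

For the tensor problem, the nonsmooth term would be a sum of nuclear norms of the mode-n unfoldings, which is why the abstract introduces an operator splitting step before applying the accelerated proximal gradient iteration.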
