Annals of Applied Mathematics (English Edition)

Gradient Descent for Symmetric Tensor Decomposition


Abstract

Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach, in which one first finds a best rank-one approximation of a given tensor and then repeats the process on the residual tensor obtained by subtracting that rank-one component. In this paper, we focus on finding a best rank-one approximation of a given orthogonally decomposable order-3 symmetric tensor. We give a geometric landscape analysis of a nonconvex optimization formulation for the best rank-one approximation of orthogonally decomposable symmetric tensors. We show that any local minimizer must be a factor of this orthogonal symmetric tensor decomposition, and that all other critical points are linear combinations of the factors. We then propose a gradient descent algorithm with a carefully designed initialization for this nonconvex optimization problem, and we prove that, for orthogonally decomposable tensors, the algorithm converges to the global minimum with high probability. Combined with the landscape analysis, this result shows that the greedy algorithm recovers the low-rank CP decomposition of the tensor. Numerical results are provided to verify our theoretical findings.
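The greedy scheme described in the abstract is easy to prototype. The sketch below is not the authors' exact algorithm: it uses plain projected (Riemannian) gradient descent on the unit sphere with a random initialization rather than the paper's carefully designed one, and the step size, iteration count, and helper names (`best_rank_one`, `greedy_decomposition`) are illustrative assumptions.

```python
# Minimal sketch of greedy rank-one deflation for an order-3 symmetric tensor.
# Assumptions: maximize T(x, x, x) over the unit sphere by projected gradient
# descent; step size and iteration count are arbitrary choices for illustration.
import numpy as np

def tensor_apply(T, x):
    """Contract an order-3 symmetric tensor T with x twice: returns T(x, x, ·)."""
    return np.einsum('ijk,j,k->i', T, x, x)

def best_rank_one(T, lr=0.05, iters=500, seed=0):
    """Approximate a best symmetric rank-one term lam * x⊗x⊗x by gradient descent."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        grad = -3.0 * tensor_apply(T, x)      # Euclidean gradient of -T(x, x, x)
        grad -= np.dot(grad, x) * x           # project onto the tangent space of the sphere
        x = x - lr * grad
        x /= np.linalg.norm(x)                # retract back to the unit sphere
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)
    return lam, x

def greedy_decomposition(T, rank):
    """Greedy deflation: peel off one estimated rank-one component at a time."""
    factors, R = [], T.copy()
    for _ in range(rank):
        lam, x = best_rank_one(R)
        factors.append((lam, x))
        R = R - lam * np.einsum('i,j,k->ijk', x, x, x)  # subtract the rank-one component
    return factors
```

For an orthogonally decomposable tensor, the landscape result summarized above guarantees that every local minimizer of the rank-one subproblem is one of the true factors, which is what allows a deflation loop of this kind to recover the full CP decomposition.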
