IEEE Transactions on Information Theory

Tensor SVD: Statistical and Computational Limits

Abstract

In this paper, we propose a general framework for tensor singular value decomposition (tensor SVD), which focuses on the methodology and theory for extracting the hidden low-rank structure from high-dimensional tensor data. Comprehensive results are developed on both the statistical and computational limits of tensor SVD. The problem exhibits three distinct phases according to the signal-to-noise ratio (SNR). In particular, with strong SNR, we show that the classical higher-order orthogonal iteration achieves the minimax optimal rate of convergence in estimation; with weak SNR, the information-theoretic lower bound implies that consistent estimation is impossible in general; with moderate SNR, we show that the non-convex maximum likelihood estimation provides an optimal solution, but at NP-hard computational cost; moreover, under the hardness hypothesis for hypergraphic planted clique detection, no polynomial-time algorithm achieves consistent estimation in general.
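To make the estimator in the strong-SNR phase concrete, below is a minimal NumPy sketch of higher-order orthogonal iteration (HOOI) for an order-3 tensor with a given Tucker rank, initialized by HOSVD. The function names, the fixed iteration count, and the synthetic rank-(2, 2, 2) example are illustrative assumptions, not details taken from the paper.

```python
import numpy as np


def unfold(T, mode):
    """Mode-k matricization: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)


def mode_product(T, M, mode):
    """Multiply tensor T along `mode` by the matrix M."""
    rest = [s for i, s in enumerate(T.shape) if i != mode]
    out = (M @ unfold(T, mode)).reshape([M.shape[0]] + rest)
    return np.moveaxis(out, 0, mode)


def leading_singular_vectors(A, r):
    """Top-r left singular vectors of a matrix."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]


def hooi(Y, ranks, n_iter=20):
    """Higher-order orthogonal iteration for a Tucker low-rank fit of Y."""
    d = Y.ndim
    # HOSVD initialization: leading singular vectors of each mode-k unfolding.
    U = [leading_singular_vectors(unfold(Y, k), ranks[k]) for k in range(d)]
    for _ in range(n_iter):
        for k in range(d):
            # Project Y onto the current subspaces of all modes except k, ...
            Z = Y
            for j in range(d):
                if j != k:
                    Z = mode_product(Z, U[j].T, j)
            # ... then update the mode-k loading matrix.
            U[k] = leading_singular_vectors(unfold(Z, k), ranks[k])
    # Core tensor and low-rank reconstruction.
    S = Y
    for k in range(d):
        S = mode_product(S, U[k].T, k)
    X_hat = S
    for k in range(d):
        X_hat = mode_product(X_hat, U[k], k)
    return X_hat, U, S


# Synthetic check: a rank-(2, 2, 2) signal observed under Gaussian noise.
rng = np.random.default_rng(0)
core = rng.normal(size=(2, 2, 2))
loadings = [np.linalg.qr(rng.normal(size=(30, 2)))[0] for _ in range(3)]
X = core
for k in range(3):
    X = mode_product(X, loadings[k], k)
Y = X + 0.1 * rng.normal(size=X.shape)

X_hat, _, _ = hooi(Y, ranks=(2, 2, 2))
print(np.linalg.norm(X_hat - X) / np.linalg.norm(X))  # small relative error at this SNR
```

As the abstract describes, this HOSVD-initialized iteration is the kind of polynomial-time procedure that succeeds when the SNR is strong; at weaker SNR no polynomial-time method is expected to produce a consistent estimate under the stated hardness hypothesis.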
