Dimension reduction plays an essential role in decreasing the complexity of solving large-scale problems. The well-known Johnson-Lindenstrauss (JL) Lemma and Restricted Isometry Property (RIP) admit the use of random projection to reduce the dimension while preserving the Euclidean distance, which led to the boom of Compressed Sensing and the field of sparsity-related signal processing. Recently, successful applications of sparse models in computer vision and machine learning have increasingly hinted that the underlying structure of high-dimensional data looks more like a union of subspaces (UoS). In this paper, motivated by the JL Lemma and the emerging field of Compressed Subspace Clustering (CSC), we study for the first time the RIP of Gaussian random matrices for the compression of two subspaces based on the generalized projection $F$-norm distance. We theoretically prove that, with high probability, the affinity and the distance between two projected subspaces are concentrated around their estimates. When the ambient dimension after projection is sufficiently large, the affinity and distance between two subspaces remain almost unchanged after random projection. Numerical experiments verify the theoretical work.
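The phenomenon described above can be illustrated numerically. The sketch below is not the paper's experiment; it is a minimal example under common conventions: two random low-dimensional subspaces of $\mathbb{R}^d$ are represented by orthonormal bases, the affinity is taken as $\|U_1^\top U_2\|_F / \sqrt{\min(k_1, k_2)}$, and a Gaussian matrix projects both subspaces down to $\mathbb{R}^m$, after which the projected bases are re-orthonormalized before the affinity is recomputed. All dimensions ($d$, $k$, $m$) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, m = 1000, 5, 200  # ambient dim, subspace dim, projected dim

# Orthonormal bases of two random k-dimensional subspaces of R^d
U1, _ = np.linalg.qr(rng.standard_normal((d, k)))
U2, _ = np.linalg.qr(rng.standard_normal((d, k)))

def affinity(X, Y):
    # Affinity of two subspaces given orthonormal basis matrices:
    # ||X^T Y||_F normalized by sqrt of the smaller dimension, so it lies in [0, 1].
    return np.linalg.norm(X.T @ Y) / np.sqrt(min(X.shape[1], Y.shape[1]))

# Gaussian random projection to R^m (scaled to be norm-preserving on average),
# followed by re-orthonormalization of the projected bases
A = rng.standard_normal((m, d)) / np.sqrt(m)
V1, _ = np.linalg.qr(A @ U1)
V2, _ = np.linalg.qr(A @ U2)

aff_before = affinity(U1, U2)
aff_after = affinity(V1, V2)
print(f"affinity before projection: {aff_before:.4f}")
print(f"affinity after  projection: {aff_after:.4f}")
```

With $m$ much larger than the subspace dimensions, the two printed affinities are close; shrinking $m$ toward $k$ makes the gap grow, consistent with the requirement that the ambient dimension after projection be sufficiently large.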