
Convex relaxation for low-dimensional representation: Phase transitions and limitations.


Abstract

There is a growing interest in taking advantage of possible patterns and structures in data so as to extract the desired information and overcome the curse of dimensionality. In a wide range of applications, including computer vision, machine learning, medical imaging, and social networks, the signal that gives rise to the observations can be modeled as approximately sparse, and exploiting this fact can be very beneficial. This has led to an immense interest in the problem of efficiently reconstructing a sparse signal from limited linear observations. More recently, low-rank approximation techniques have become prominent tools to approach problems arising in machine learning, system identification, and quantum tomography.

In sparse and low-rank estimation problems, the challenge is the inherent intractability of the objective function, and one needs efficient methods to capture the low-dimensionality of these models. Convex optimization is often a promising tool to attack such problems: an intractable problem with a combinatorial objective can often be "relaxed" to obtain a tractable but almost as powerful convex optimization problem. This dissertation studies convex optimization techniques that can take advantage of low-dimensional representations of the underlying high-dimensional data. We provide provable guarantees that ensure that the proposed algorithms succeed under reasonable conditions, and answer questions of the following flavor: 1. For a given number of measurements, can we reliably estimate the true signal? 2. If so, how good is the reconstruction as a function of the model parameters?

More specifically: i) Focusing on linear inverse problems, we generalize the classical error bounds known for the least-squares technique to the lasso formulation, which incorporates the signal model. ii) We show that intuitive convex approaches do not perform as well as expected when it comes to signals that have multiple low-dimensional structures simultaneously. iii) Finally, we propose convex relaxations for the graph clustering problem and give sharp performance guarantees for a family of graphs arising from the so-called stochastic block model. We pay particular attention to the following aspects. For i) and ii), we aim to provide a general geometric framework in which the results on sparse and low-rank estimation can be obtained as special cases. For i) and iii), we investigate the precise performance characterization, which yields the right constants in our bounds and the true dependence between the problem parameters.
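To illustrate the convex-relaxation theme behind item i), the sketch below recovers a sparse signal from Gaussian linear measurements by solving the lasso (the ℓ1 relaxation of the combinatorial sparsity objective) with cvxpy. It is a minimal illustration, not code from the dissertation; the problem sizes, noise level, and regularization weight `lam` are assumed values chosen for demonstration.

```python
# Minimal sketch: sparse recovery via the lasso (l1 relaxation).
# Sizes, noise level, and `lam` are illustrative assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 80, 10                       # ambient dimension, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse ground truth

A = rng.standard_normal((m, n)) / np.sqrt(m)    # Gaussian measurement matrix
y = A @ x0 + 0.01 * rng.standard_normal(m)      # noisy linear observations

lam = 0.1                                        # illustrative regularization weight
x = cp.Variable(n)
lasso = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(A @ x - y) + lam * cp.norm1(x)))
lasso.solve()

print("relative reconstruction error:",
      np.linalg.norm(x.value - x0) / np.linalg.norm(x0))
```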
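Similarly, for the graph clustering problem in item iii), a common convex relaxation replaces the combinatorial partition search with a semidefinite program over a cluster-membership matrix. The sketch below is a generic relaxation of this kind for two balanced communities drawn from a stochastic block model; it is not the dissertation's exact program, and the parameters `n`, `p`, `q` are assumed for illustration.

```python
# Minimal sketch: SDP relaxation for two-community clustering under a
# stochastic block model. Parameters are illustrative assumptions.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, p, q = 60, 0.7, 0.2                           # nodes, in-/cross-cluster edge probabilities
labels = np.sign(np.arange(n) - n / 2 + 0.5)     # ground-truth +/-1 communities
P = np.where(np.outer(labels, labels) > 0, p, q)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                   # symmetric adjacency, no self-loops

X = cp.Variable((n, n), symmetric=True)          # relaxation of labels @ labels.T
prob = cp.Problem(cp.Maximize(cp.trace(A @ X)),
                  [X >> 0, cp.diag(X) == 1, cp.sum(X) == 0])
prob.solve()

# Round the leading eigenvector of the SDP solution to recover the partition.
w, V = np.linalg.eigh(X.value)
estimate = np.sign(V[:, -1])
accuracy = max(np.mean(estimate == labels), np.mean(estimate == -labels))
print("clustering accuracy:", accuracy)
```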

Record details

  • Author

    Oymak, Samet.

  • Author affiliation

    California Institute of Technology.

  • Degree grantor: California Institute of Technology.
  • Subjects: Engineering, Electronics and Electrical; Computer Science; Applied Mathematics
  • Degree: Ph.D.
  • Year: 2015
  • Pagination: 302 p.
  • Total pages: 302
  • Format: PDF
  • Language: eng
