
A coordinate gradient descent method for structured nonsmooth optimization.


Abstract

We consider the problem of minimizing the sum of a smooth function and a (block-separable) convex function, with or without linear constraints. This problem includes as special cases bound-constrained optimization, smooth optimization with ℓ1-regularization, and linearly constrained smooth optimization, such as the large-scale quadratic programs arising in the training of support vector machines. We propose a (block) coordinate gradient descent method for solving this class of structured nonsmooth problems. The method is simple, highly parallelizable, and suited to large-scale applications in signal/image denoising, regression, and data mining/classification.

We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. The local Lipschitzian error bound holds under assumptions analogous to those for constrained smooth optimization, e.g., the convex function is polyhedral and the smooth function is (nonconvex) quadratic or is the composition of a strongly convex function with a linear mapping.

We report numerical experience with solving ℓ1-regularized versions of unconstrained optimization problems from Moré et al. [73] and from the CUTEr set [38], as well as some large-scale quadratic programs of support vector machines arising from two-class data classification. We compare with L-BFGS-B and MINOS applied to a reformulation of the ℓ1-regularized problem as a bound-constrained smooth optimization problem, and with LIBSVM on the large-scale quadratic programming problems of support vector machines.

In addition, we consider the bi-level problem of minimizing a nonsmooth convex function over the set of stationary points of a certain smooth function. Assuming that the smooth function is convex; that the convex function is proper, level-bounded, and lower semicontinuous; and that the set of stationary points of the smooth function over the domain of the convex function is nonempty, we propose a regularization strategy for solving this bi-level problem with a (block) coordinate gradient descent method. We prove that any cluster point of the generated iterates is a solution of the bi-level problem.
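For concreteness, the problem class and the method's direction-finding step can be written as follows. This is a sketch in our own notation, consistent with the abstract's description; H denotes a positive definite (typically diagonal) approximation of the Hessian of f:

```latex
% Problem class: smooth f plus block-separable convex P
\min_{x \in \mathbb{R}^n} \; F(x) = f(x) + P(x),
\qquad P(x) = \sum_{J \in \mathcal{J}} P_J(x_J),
\quad \text{e.g. } P(x) = \lambda \|x\|_1 .

% One (block) coordinate gradient descent step on a chosen block J:
d^J = \operatorname*{arg\,min}_{d \,:\, d_j = 0 \ \forall j \notin J}
      \; \nabla f(x)^{\top} d + \tfrac{1}{2}\, d^{\top} H d + P(x + d),
\qquad x \leftarrow x + \alpha\, d^J,

% with the stepsize \alpha chosen by an Armijo-type rule on F.
```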
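For ℓ1-regularization with a scalar diagonal approximation H = hI, the per-coordinate subproblem above has a closed-form soft-thresholding solution. The following Python sketch illustrates this special case under those assumptions; the names soft_threshold and cgd_l1, the random coordinate order, and the parameter defaults are illustrative, not taken from the dissertation.

```python
import numpy as np

def soft_threshold(z, t):
    """argmin_u 0.5*(u - z)**2 + t*|u|  (closed-form shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_l1(f, grad_f, x0, lam, h=1.0, n_sweeps=50, sigma=0.1, beta=0.5):
    """Coordinate gradient descent sketch for min_x f(x) + lam*||x||_1.

    h     -- scalar diagonal Hessian approximation (H = h*I), assumed > 0
    sigma -- Armijo sufficient-decrease parameter
    beta  -- backtracking factor for the stepsize
    """
    x = x0.astype(float).copy()
    for _ in range(n_sweeps):
        for j in np.random.permutation(x.size):
            g = grad_f(x)[j]
            # Direction: solve min_d g*d + (h/2)*d**2 + lam*|x_j + d|
            # in closed form via soft-thresholding.
            d = soft_threshold(x[j] - g / h, lam / h) - x[j]
            if d == 0.0:
                continue
            # Predicted decrease (always negative here), used in the
            # Armijo rule on F = f + lam*||.||_1.
            delta = g * d + lam * (abs(x[j] + d) - abs(x[j]))
            F_old = f(x) + lam * np.abs(x).sum()
            alpha = 1.0
            while alpha > 1e-12:
                x_trial = x.copy()
                x_trial[j] += alpha * d
                if (f(x_trial) + lam * np.abs(x_trial).sum()
                        <= F_old + sigma * alpha * delta):
                    x = x_trial
                    break
                alpha *= beta
    return x
```

Recomputing the full gradient for every coordinate, as done here for clarity, is wasteful; a practical implementation for, say, f(x) = ½‖Ax − b‖² would maintain the residual Ax − b incrementally so that each coordinate step is cheap.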

Bibliographic details

  • Author: Yun, Sangwoon
  • Affiliation: University of Washington
  • Degree-granting institution: University of Washington
  • Subject: Mathematics
  • Degree: Ph.D.
  • Year: 2007
  • Pages: 134 p.
  • Total pages: 134
  • Format: PDF
  • Language: English
  • CLC classification: Mathematics
  • Keywords
