IEEE Transactions on Signal Processing

Inexact Block Coordinate Descent Algorithms for Nonsmooth Nonconvex Optimization



Abstract

In this paper, we propose an inexact block coordinate descent algorithm for large-scale nonsmooth nonconvex optimization problems. At each iteration, a particular block variable is selected and updated by inexactly solving the original optimization problem with respect to that block variable. More precisely, a local approximation of the original optimization problem is solved. The proposed algorithm has several attractive features, namely, i) high flexibility, as the approximation function only needs to be strictly convex and it does not have to be a global upper bound of the original function; ii) fast convergence, as the approximation function can be designed to exploit the problem structure at hand and the stepsize is calculated by the line search; iii) low complexity, as the approximation subproblems are much easier to solve and the line search scheme is carried out over a properly constructed differentiable function; iv) guaranteed convergence of a subsequence to a stationary point, even when the objective function does not have a Lipschitz continuous gradient. Interestingly, when the approximation subproblem is solved by a descent algorithm, convergence of a subsequence to a stationary point is still guaranteed even if the approximation subproblem is solved inexactly by terminating the descent algorithm after a finite number of iterations. These features make the proposed algorithm suitable for large-scale problems where the dimension exceeds the memory and/or the processing capability of the existing hardware. These features are also illustrated by several applications in signal processing and machine learning, for instance, network anomaly detection and phase retrieval. To promote reproducible research, the simulation code is available at
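To make the iteration pattern described in the abstract concrete, below is a minimal Python sketch of one possible inexact block coordinate descent scheme for a LASSO-type problem 0.5*||Ax - b||^2 + lam*||x||_1. The block surrogate (a proximal-linear model), the early-terminated proximal-gradient inner solver, and the plain backtracking line search on the original objective are assumptions chosen for illustration only; the paper's algorithm admits more general strictly convex approximations and performs the line search over a properly constructed differentiable function.

```python
# Illustrative sketch only, NOT the paper's exact scheme: the surrogate,
# the inexact inner solver, and the line search below are assumptions.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(A, b, x, lam):
    """Composite objective: smooth least-squares part plus l1 penalty."""
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()

def inexact_bcd(A, b, lam, block_size=50, inner_iters=5, outer_iters=20):
    m, n = A.shape
    x = np.zeros(n)
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    for it in range(outer_iters):
        for blk in blocks:                        # cyclic block selection
            A_blk = A[:, blk]
            r = A @ x - b                         # residual at the current point
            # Inexactly minimize the strictly convex block surrogate
            # 0.5*||A_blk (z - x_blk) + r||^2 + lam*||z||_1
            # by a few proximal-gradient steps (early termination => inexact).
            L = max(np.linalg.norm(A_blk, 2) ** 2, 1e-12)  # block Lipschitz constant
            z = x[blk].copy()
            for _ in range(inner_iters):
                grad = A_blk.T @ (A_blk @ (z - x[blk]) + r)
                z = soft_threshold(z - grad / L, lam / L)
            d = np.zeros(n)
            d[blk] = z - x[blk]                   # update direction for this block
            # Simple backtracking on the original objective; the paper instead
            # searches over a constructed differentiable function.
            gamma, f_old = 1.0, objective(A, b, x, lam)
            while gamma > 1e-8 and objective(A, b, x + gamma * d, lam) > f_old:
                gamma *= 0.5
            if objective(A, b, x + gamma * d, lam) <= f_old:
                x = x + gamma * d
    return x
```

For a quick check one could draw random data, e.g. `A = np.random.randn(200, 500)`, `b = np.random.randn(200)`, and call `inexact_bcd(A, b, lam=0.1)`. Terminating the inner proximal-gradient loop after a fixed number of steps is what makes each block update "inexact" in the sense described in the abstract.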


