Mathematics of Operations Research

Extrapolated Proximal Subgradient Algorithms for Nonconvex and Nonsmooth Fractional Programs

Abstract

In this paper, we consider a broad class of nonsmooth and nonconvex fractional programs, which encompass many important modern optimization problems arising from diverse areas, such as the recently proposed scale-invariant sparse signal reconstruction problem in signal processing. We propose a proximal subgradient algorithm with extrapolations for solving this optimization model and show that the sequence of iterates generated by the algorithm is bounded and that any of its limit points is a stationary point of the model problem. The choice of our extrapolation parameter is flexible and includes the popular extrapolation parameter adopted in the restarted fast iterative shrinkage-thresholding algorithm (FISTA). By providing a unified analysis framework for descent methods, we establish convergence of the full sequence under the assumption that a suitable merit function satisfies the Kurdyka-Łojasiewicz property. Our algorithm exhibits linear convergence for the scale-invariant sparse signal reconstruction problem and the Rayleigh quotient problem over a spherical constraint. When the denominator is the maximum of finitely many continuously differentiable weakly convex functions, we also propose another extrapolated proximal subgradient algorithm with guaranteed convergence to a stronger notion of stationary points of the model problem. Finally, we illustrate the proposed methods by both analytical and simulated numerical examples.
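
The abstract does not state the update rule explicitly, so the following is a minimal sketch of one plausible form of a proximal subgradient step with FISTA-style extrapolation for minimizing a ratio f(x)/g(x), assuming f admits an easy proximal map and g is positive with computable subgradients. The function names, step size, extrapolation weight, and the toy objective ||x - c||_1 / (1 + ||x||_2) are illustrative assumptions, not taken from the paper.

import numpy as np

def soft_threshold(z, t):
    # Proximal map of t * ||.||_1 (componentwise soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def extrapolated_prox_subgradient(x0, f_val, prox_f, g_val, g_subgrad,
                                  step=0.5, beta=0.3, iters=200):
    # Generic iteration for min f(x)/g(x) (an illustrative assumption, not the
    # authors' exact scheme):
    #   y_k     = x_k + beta * (x_k - x_{k-1})              (extrapolation)
    #   c_k     = f(x_k) / g(x_k)                           (current ratio)
    #   x_{k+1} = prox_{step f}( y_k + step * c_k * v_k ),  v_k in subdiff g(x_k)
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        c = f_val(x) / g_val(x)
        v = g_subgrad(x)
        x_prev, x = x, prox_f(y + step * c * v, step)
    return x

# Toy problem: minimize ||x - c||_1 / (1 + ||x||_2); the ratio equals 0 at x = c,
# so the iterates should drive the objective down from its starting value.
rng = np.random.default_rng(0)
c = rng.standard_normal(10)
f_val = lambda x: np.sum(np.abs(x - c))
g_val = lambda x: 1.0 + np.linalg.norm(x)
x = extrapolated_prox_subgradient(
    x0=np.zeros(10),
    f_val=f_val,
    prox_f=lambda z, t: c + soft_threshold(z - c, t),  # prox of t * ||. - c||_1
    g_val=g_val,
    g_subgrad=lambda x: x / (np.linalg.norm(x) + 1e-12),
)
print("objective ratio:", f_val(x) / g_val(x))

For the scale-invariant (L1/L2) sparse recovery problem mentioned in the abstract, the same template would use soft thresholding for the L1 numerator and x/||x||_2 as a subgradient of the L2 denominator.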
