SIAM Journal on Optimization: A Publication of the Society for Industrial and Applied Mathematics

Convergence of the gradient sampling algorithm for nonsmooth nonconvex optimization

Abstract

We study the gradient sampling algorithm of Burke, Lewis, and Overton for minimizing a locally Lipschitz function f on R^n that is continuously differentiable on an open dense subset. We strengthen the existing convergence results for this algorithm and introduce a slightly revised version for which stronger results are established without requiring compactness of the level sets of f. In particular, we show that with probability 1 the revised algorithm either drives the f-values to -infinity, or each of its cluster points is Clarke stationary for f. We also consider a simplified variant in which the differentiability check is skipped and the user can control the number of f-evaluations per iteration.
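The abstract summarizes the algorithm rather than stating it, so the following is a minimal Python sketch of one common form of a gradient sampling iteration, not the paper's own pseudocode. It assumes a caller-supplied gradient oracle grad that is valid off the non-differentiable set, and it mirrors the simplified variant described above in that the differentiability check is skipped and the sample size m controls the per-iteration sampling cost. All parameter names and values (eps, nu, theta, beta, gamma, m) are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize


def min_norm_in_hull(G):
    """Minimum-norm element of the convex hull of the rows of G, via the
    QP  min_w ||G^T w||^2  s.t.  w >= 0, sum(w) = 1  (solved with SLSQP)."""
    k = G.shape[0]
    Q = G @ G.T
    res = minimize(
        lambda w: w @ Q @ w,
        np.full(k, 1.0 / k),                  # start at the centroid
        jac=lambda w: 2.0 * Q @ w,
        bounds=[(0.0, 1.0)] * k,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x @ G


def gradient_sampling(f, grad, x0, eps=0.1, nu=1e-6, theta=0.1,
                      beta=1e-4, gamma=0.5, m=None, max_iter=500, rng=None):
    """Hedged sketch of a gradient sampling loop; parameter names are
    illustrative. The differentiability check is skipped, as in the
    simplified variant discussed in the abstract."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    n = x.size
    m = m or 2 * n                            # the theory needs m >= n + 1
    for _ in range(max_iter):
        # Sample m points uniformly from the eps-ball around x.
        u = rng.standard_normal((m, n))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        pts = x + eps * rng.random((m, 1)) ** (1.0 / n) * u
        # Gradients at x and at the sampled points approximate the
        # eps-subdifferential; take the minimum-norm element of their hull.
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_in_hull(G)
        gnorm = np.linalg.norm(g)
        if gnorm <= nu:
            eps *= theta                      # near-stationary: shrink radius
            continue
        # Armijo backtracking line search along the descent direction -g.
        t = 1.0
        while f(x - t * g) > f(x) - beta * t * gnorm**2 and t > 1e-12:
            t *= gamma
        x = x - t * g
    return x
```

As a usage illustration, running this sketch on a simple nonsmooth instance such as f = lambda x: np.abs(x).sum() with grad = np.sign (well defined off the kinks, which are sampled with probability 0) drives the iterates toward the Clarke stationary point at the origin.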
