Optimization methods & software

Proximal gradient method with automatic selection of the parameter by automatic differentiation

Abstract

A class of non-smooth convex optimization problems that arises naturally in applications such as sparse group Lasso has attracted significant research effort on parameter selection. For given parameters, the proximal gradient method (PGM) solves such problems effectively with a linear convergence rate, and a closed-form solution is available at each iteration. In many practical applications, however, the choice of parameters not only affects the quality of the solution but can even determine whether the solution is correct at all. In this paper, we study a new method for analysing the impact of the parameters on the PGM algorithm applied to the non-smooth convex optimization problem. We present a sensitivity analysis of the output of an optimization algorithm with respect to its parameters, and show the advantage of carrying out this analysis with automatic differentiation. We then propose a hybrid algorithm, built on PGM, for selecting the optimal parameter. Numerical results show that the proposed method is effective for solving the sparse signal recovery problem.
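
The two ingredients named in the abstract, proximal gradient iterations with a closed-form prox and parameter sensitivities obtained by automatic differentiation, can be illustrated with a minimal JAX sketch. This is not the paper's algorithm: the problem instance (a plain Lasso model rather than sparse group Lasso), the step size, the iteration count, and the names soft_threshold, pgm and recovery_loss are all illustrative assumptions. The sketch only shows how differentiating through unrolled PGM iterations yields the derivative of a solution-quality measure with respect to the regularization parameter.

# Minimal sketch (assumed setup, not the paper's method): proximal gradient (ISTA)
# for min_x 0.5*||Ax - b||^2 + lam*||x||_1, unrolled so that jax.grad can
# differentiate the output with respect to the parameter lam.
import jax
import jax.numpy as jnp

def soft_threshold(v, t):
    # Closed-form prox of t*||.||_1 (soft-thresholding).
    return jnp.sign(v) * jnp.maximum(jnp.abs(v) - t, 0.0)

def pgm(A, b, lam, step, n_iters=200):
    # Proximal gradient iterations; the Python loop is unrolled, so the
    # whole map lam -> x_hat is differentiable by automatic differentiation.
    x = jnp.zeros(A.shape[1])
    for _ in range(n_iters):
        grad_smooth = A.T @ (A @ x - b)              # gradient of the smooth term
        x = soft_threshold(x - step * grad_smooth, step * lam)
    return x

def recovery_loss(lam, A, b, x_true, step):
    # Quality of the recovered signal as a function of the parameter lam.
    x_hat = pgm(A, b, lam, step)
    return jnp.sum((x_hat - x_true) ** 2)

key = jax.random.PRNGKey(0)
A = jax.random.normal(key, (40, 100))
x_true = jnp.zeros(100).at[:5].set(1.0)              # sparse ground truth
b = A @ x_true
step = 1.0 / jnp.linalg.norm(A, 2) ** 2              # 1/L for the smooth part

# Sensitivity of the output with respect to lam by automatic differentiation.
dloss_dlam = jax.grad(recovery_loss)(0.1, A, b, x_true, step)
print(dloss_dlam)

In this sketch the derivative d(recovery_loss)/d(lam) plays the role of the sensitivity information that a parameter-selection scheme could exploit, for example inside a hybrid outer loop that updates lam while PGM solves the inner problem.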
