We introduce a novel approach for analyzing the performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space $R^d$. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is by itself an optimization problem, which we call the Performance Estimation Problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which, among others, includes the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class, it is possible to derive new numerical bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we show an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best performance.
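As a point of reference for the classical gradient method analyzed above, the following is a minimal sketch (not taken from the paper) of fixed-step gradient descent on a smooth convex quadratic; the function `gradient_method`, the step-size parameter `h`, and the test problem are illustrative assumptions, with `h/L` playing the role of the step size whose worst-case behavior the PEP framework bounds.

```python
import numpy as np

def gradient_method(grad, x0, L, h=1.0, iters=50):
    """Classical gradient method with fixed step size h/L.

    grad  -- gradient oracle of a smooth convex function
    L     -- Lipschitz constant of the gradient
    h     -- normalized step size (hypothetical parameter name)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - (h / L) * grad(x)  # one first-order black-box step
    return x

# Illustrative problem: f(x) = 0.5 * x^T A x, minimized at the origin;
# L is the largest eigenvalue of A.
A = np.diag([1.0, 4.0])
L = 4.0
x_final = gradient_method(lambda x: A @ x, [1.0, 1.0], L)
```

For this quadratic, each coordinate contracts by the factor `1 - h * lambda_i / L` per iteration, so the iterates converge to the minimizer at the origin; the PEP instead asks for the worst such function in the whole smooth convex class.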