Data Mining Workshops, 2009 (ICDMW '09)

Nonsmooth Bilevel Programming for Hyperparameter Selection


Abstract

We propose a nonsmooth bilevel programming method for training linear learning models whose hyperparameters are optimized via $T$-fold cross-validation (CV). The algorithm scales well in the sample size, and it handles loss functions with embedded maxima, such as those in support vector machines. Current practice constructs models over a predefined grid of hyperparameter combinations and selects the best one, an inefficient heuristic. Improving on previous bilevel CV approaches, this paper represents an advance toward the goal of self-tuning supervised data mining, as well as a significant innovation in scalable bilevel programming algorithms. In the bilevel CV formulation, the lower-level problems are treated as unconstrained optimization problems and are replaced with their optimality conditions; the resulting nonlinear program is nonsmooth and nonconvex. We develop a novel bilevel programming algorithm to solve this class of problems and apply it to linear least-squares support vector regression with hyperparameters $C$ (tradeoff) and $\epsilon$ (loss insensitivity). The new approach outperforms grid search and prior smooth bilevel CV methods in modeling performance, and its increased speed makes modeling with a larger number of hyperparameters practical.
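The bilevel CV formulation described in the abstract can be sketched schematically as follows, in generic notation (the validation index sets $\Omega_t$ and the squared $\epsilon$-insensitive loss in the inner objective are illustrative assumptions; the authors' exact formulation may differ):

```latex
\min_{C,\,\epsilon \ge 0} \;
  \frac{1}{T} \sum_{t=1}^{T} \frac{1}{|\Omega_t|}
  \sum_{i \in \Omega_t} \left( x_i^\top w^t - y_i \right)^2
\quad \text{s.t.} \quad
w^t \in \arg\min_{w} \; \frac{1}{2}\|w\|^2
  + C \sum_{j \notin \Omega_t}
    \max\!\left( |x_j^\top w - y_j| - \epsilon,\, 0 \right)^2,
\qquad t = 1, \dots, T.
```

Because each inner problem is unconstrained, its arg min can be replaced by a first-order optimality condition (a subgradient set to zero), which yields the nonsmooth, nonconvex nonlinear program the abstract refers to.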
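For contrast, the grid-search baseline that the paper improves on can be sketched in a few lines: for each $(C, \epsilon)$ pair on a predefined grid, estimate the $T$-fold CV error of a linear $\epsilon$-insensitive least-squares model, then keep the best pair. The (sub)gradient-descent fit below is an illustrative stand-in, not the authors' solver, and all function names are invented for this sketch.

```python
import numpy as np

def fit_linear(X, y, C, eps, iters=400, lr=0.005):
    # Minimize 0.5*||w||^2 + C * sum_i max(|x_i.w - y_i| - eps, 0)^2
    # by plain (sub)gradient descent; a stand-in for a real solver.
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        r = X @ w - y
        slack = np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)
        w -= lr * (w + 2.0 * C * (X.T @ slack))
    return w

def cv_error(X, y, C, eps, T=3):
    # Mean validation MSE over T folds for one (C, eps) pair.
    folds = np.array_split(np.arange(len(y)), T)
    err = 0.0
    for t in range(T):
        test = folds[t]
        train = np.concatenate([folds[s] for s in range(T) if s != t])
        w = fit_linear(X[train], y[train], C, eps)
        err += np.mean((X[test] @ w - y[test]) ** 2)
    return err / T

def grid_search(X, y, Cs, epss):
    # The "predefined grid" heuristic: try every combination, keep the best.
    return min(((C, e) for C in Cs for e in epss),
               key=lambda p: cv_error(X, y, *p))
```

The grid loop trains one model per fold per grid point, so its cost grows multiplicatively with the number of hyperparameters, which is exactly the inefficiency the bilevel approach targets.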
