SIGKDD Explorations

Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms



Abstract

Many different machine learning algorithms exist; taking into account each algorithm's hyperparameters, there is a staggeringly large number of possible alternatives overall. We consider the problem of simultaneously selecting a learning algorithm and setting its hyperparameters, going beyond previous work that attacks these issues separately. We show that this problem can be addressed by a fully automated approach, leveraging recent innovations in Bayesian optimization. Specifically, we consider a wide range of feature selection techniques (combining 3 search and 8 evaluator methods) and all classification approaches implemented in WEKA's standard distribution, spanning 2 ensemble methods, 10 meta-methods, 27 base classifiers, and hyperparameter settings for each classifier. On each of 21 popular datasets from the UCI repository, the KDD Cup 09, variants of the MNIST dataset and CIFAR-10, we show classification performance often much better than using standard selection and hyperparameter optimization methods. We hope that our approach will help non-expert users to more effectively identify machine learning algorithms and hyperparameter settings appropriate to their applications, and hence to achieve improved performance.
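The combined algorithm selection and hyperparameter optimization problem described in the abstract can be made concrete with a small sketch. The paper itself uses Bayesian optimization over WEKA's classifier space; the toy version below instead uses plain random search over two scikit-learn classifiers (the model names and search budget here are illustrative assumptions, not the paper's actual configuration), treating the choice of algorithm and its hyperparameters as a single joint search space.

```python
# Toy sketch of the CASH problem (Combined Algorithm Selection and
# Hyperparameter optimization). NOTE: this is a simplified illustration;
# Auto-WEKA uses Bayesian optimization (SMAC/TPE) over WEKA classifiers,
# whereas this sketch uses random search over scikit-learn models.
import random

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

random.seed(0)
X, y = load_iris(return_X_y=True)

# Joint search space: each entry pairs an algorithm with a sampler
# that draws random hyperparameters for it.
space = [
    (DecisionTreeClassifier, lambda: {"max_depth": random.randint(1, 10)}),
    (KNeighborsClassifier, lambda: {"n_neighbors": random.randint(1, 15)}),
]

best_score, best_config = -1.0, None
for _ in range(20):  # evaluate 20 sampled (algorithm, hyperparameter) pairs
    algo, sampler = random.choice(space)
    params = sampler()
    # Cross-validated accuracy is the objective being maximized.
    score = cross_val_score(algo(**params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, (algo.__name__, params)

print(best_config, round(best_score, 3))
```

Replacing the random sampler with a model-based optimizer that proposes promising configurations from past evaluations yields the Bayesian-optimization approach the paper advocates.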
