Pattern Recognition: The Journal of the Pattern Recognition Society

Hyper-parameter optimization in classification: To-do or not-to-do


Abstract

Hyper-parameter optimization is the process of finding suitable hyper-parameters for predictive models. It typically incurs high computational costs, because determining the effectiveness of each set of candidate hyper-parameter values requires a time-consuming model training run. A priori, there is no guarantee that hyper-parameter optimization leads to improved performance. In this work, we propose a framework to address the question of whether one should apply hyper-parameter optimization or use the default hyper-parameter settings for traditional classification algorithms. We implemented a prototype of the framework, which we use as a basis for a three-fold evaluation with 486 datasets and 4 algorithms. The results indicate that our framework is effective at supporting modeling tasks by avoiding the adverse effects of ineffective optimizations. The results also demonstrate that incrementally adding training datasets improves the predictive performance of framework instantiations and hence enables "life-long learning." (C) 2020 Elsevier Ltd. All rights reserved.
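The trade-off the abstract describes can be made concrete with a small sketch. The following is a minimal, purely illustrative example (not the paper's framework): it compares a default hyper-parameter setting against a small grid search for a one-dimensional threshold classifier, where each candidate evaluation stands in for a full model-training run. The data, the default value, and the candidate grid are all invented for illustration.

```python
# Minimal sketch of "default vs. optimized hyper-parameters" on a toy
# one-dimensional threshold classifier. All values here are hypothetical.

def accuracy(threshold, data):
    """Fraction of (x, label) pairs that the rule (x >= threshold) classifies correctly."""
    return sum((x >= threshold) == label for x, label in data) / len(data)

# Toy training data: (feature value, binary label).
data = [(0.1, False), (0.2, False), (0.35, False),
        (0.4, True), (0.7, True), (0.9, True)]

DEFAULT = 0.5                            # the "default hyper-parameter setting"
candidates = [0.2, 0.3, 0.4, 0.5, 0.6]  # the search grid; each entry costs one evaluation

default_acc = accuracy(DEFAULT, data)
best = max(candidates, key=lambda t: accuracy(t, data))
best_acc = accuracy(best, data)

# Optimization is worthwhile only when the accuracy gain justifies the cost
# of evaluating every candidate -- the question the paper's framework predicts.
print(default_acc, best, best_acc)  # → 0.833..., 0.4, 1.0
```

Because the default value is itself one of the candidates, the optimized score can never fall below the default score here; in realistic settings the cost of each evaluation is what makes the "to-do or not-to-do" decision non-trivial.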
