
An Empirical Study on Regression Algorithms



Abstract

Regression is one of the most important tasks in machine learning and data mining. In recent years, a number of learning algorithms for regression have been introduced. Unfortunately, few studies have compared the performance of these methods experimentally. In this paper, we present an empirical comparison of ten algorithms used for regression: k-nearest neighbor (KNN), distance-weighted k-nearest neighbor (KNNDW), linear regression (LR), locally weighted linear regression (LWLR), model trees (M5P), reduced-error pruning tree (REPTree), naive Bayes (NB), genetic programming (GP), back propagation (BP), and support vector machine (SVM). We test these ten regression algorithms on all 36 data sets recommended by Weka. Ten-fold cross-validation (CV) is used to compare the algorithms. Finally, we analyze the experimental results via a two-tailed t-test at the 95% significance level to compare the performance of each pair of algorithms in terms of relative absolute error.
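The evaluation protocol described above can be sketched as follows. This is a minimal illustrative example, not the paper's exact setup: the data set (scikit-learn's diabetes data rather than the 36 Weka data sets) and the two learners compared (KNN and linear regression) are assumptions chosen for brevity. It shows ten-fold cross-validation scored by relative absolute error (RAE), followed by a two-tailed paired t-test at the 95% significance level.

```python
import numpy as np
from scipy.stats import ttest_rel
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold
from sklearn.neighbors import KNeighborsRegressor


def rae(y_true, y_pred):
    # Relative absolute error: total absolute error of the model,
    # normalized by the error of always predicting the mean target.
    return np.sum(np.abs(y_true - y_pred)) / np.sum(np.abs(y_true - y_true.mean()))


# Illustrative data set and models (assumptions, not the paper's).
X, y = load_diabetes(return_X_y=True)
models = {"KNN": KNeighborsRegressor(n_neighbors=5), "LR": LinearRegression()}
scores = {name: [] for name in models}

# Ten-fold cross-validation: collect one RAE score per model per fold.
for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    for name, model in models.items():
        model.fit(X[train], y[train])
        scores[name].append(rae(y[test], model.predict(X[test])))

# Two-tailed paired t-test on the per-fold RAE scores of the two models.
t_stat, p_value = ttest_rel(scores["KNN"], scores["LR"])
print({name: round(float(np.mean(s)), 3) for name, s in scores.items()})
print("significantly different at the 95% level:", p_value < 0.05)
```

With ten algorithms, the same paired test would be run for each of the 45 algorithm pairs on each data set.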
