Neural Processing Letters

Choosing Between Two Classification Learning Algorithms Based on Calibrated Balanced 5 x 2 Cross-Validated F-Test



Abstract

The 5 x 2 cross-validated F-test, based on five independent replications of 2-fold cross-validation, is recommended for choosing between two classification learning algorithms. However, reusing the same data across the cross-validations makes the true degrees of freedom (DOF) of the test lower than those of the F(10, 5) distribution given in (Neural Comput 11:1885-1892, [1]), which easily leads the test to suffer from high type I and type II errors. Moreover, the random partitions used in cross-validation make the DOF of the test difficult to analyze. Notably, Wang et al. (Neural Comput 26(1):208-235, [2]) proposed a blocked cross-validation that accounts for the correlation between any two 2-fold cross-validations. Building on this, the present study calibrates the DOF of the F(10, 5) distribution and puts forward a calibrated balanced cross-validated F-test that follows an F(7, 5) distribution. Studies on simulated and real data demonstrate that, in most cases, the calibrated balanced cross-validated F-test has lower type I and type II errors than the cross-validated F-test following F(10, 5).
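As a minimal sketch of the classical 5 x 2 cross-validated F-statistic described above: given the per-fold differences in error rate between the two algorithms, the statistic is the sum of the squared differences divided by twice the sum of the per-replication variance estimates. The difference values below are hypothetical, chosen only for illustration; under the calibration proposed in the paper, only the reference distribution changes, from F(10, 5) to F(7, 5).

```python
import numpy as np
from scipy import stats

def five_by_two_cv_f_statistic(p):
    """Classical 5x2cv F statistic.

    p: array of shape (5, 2), where p[i, j] is the difference in error
    rate between the two algorithms on fold j of replication i.
    """
    p = np.asarray(p, dtype=float)
    p_bar = p.mean(axis=1, keepdims=True)     # mean difference per replication
    s2 = ((p - p_bar) ** 2).sum(axis=1)       # variance estimate per replication
    return (p ** 2).sum() / (2.0 * s2.sum())

# Hypothetical per-fold error-rate differences (5 replications x 2 folds).
p = np.array([[0.04, 0.02],
              [0.03, 0.05],
              [0.02, 0.01],
              [0.04, 0.03],
              [0.05, 0.02]])

f = five_by_two_cv_f_statistic(p)

# p-value under the classical F(10, 5) null vs. the calibrated F(7, 5) null.
p_classical = stats.f.sf(f, 10, 5)
p_calibrated = stats.f.sf(f, 7, 5)
```

Because F(7, 5) has fewer numerator degrees of freedom, the same statistic is judged against a heavier-tailed null, which is how the calibration compensates for the overstated DOF of the classical test.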
