
Efficient Cross-Validation of Kernel Fisher Discriminant Classifiers


Abstract

Mika et al. introduce a non-linear formulation of the Fisher discriminant based on the well-known "kernel trick", later shown to be equivalent to the Least-Squares Support Vector Machine. In this paper, we show that the cross-validation error can be computed very efficiently for this class of kernel machine; specifically, leave-one-out cross-validation can be performed with a computational complexity of only O(l^3) operations (the same as that of the basic training algorithm), rather than the O(l^4) of a direct implementation. This makes leave-one-out cross-validation a practical proposition for model selection in much larger-scale applications of KFD classifiers.
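To illustrate the kind of result described in the abstract, the sketch below trains an LS-SVM/KFD classifier by solving the usual (l+1)x(l+1) linear system and then obtains all l leave-one-out residuals from the PRESS-style identity y_i - f^{(-i)}(x_i) = alpha_i / [M^{-1}]_{ii}, so the whole leave-one-out estimate costs a single O(l^3) factorisation rather than l separate retrainings. This is a minimal numpy sketch under standard LS-SVM assumptions (+/-1 targets, RBF kernel); the function names, kernel, data and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    d2 = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * d2)

def lssvm_loo_residuals(K, y, reg=1.0):
    """Train an LS-SVM/KFD classifier and return its leave-one-out residuals.

    Solves the (l+1)x(l+1) system
        [ K + I/reg  1 ] [alpha]   [y]
        [ 1^T        0 ] [  b  ] = [0]
    and applies the closed-form identity
        y_i - f^{(-i)}(x_i) = alpha_i / [M^{-1}]_{ii},
    so all l leave-one-out residuals come from one O(l^3) matrix inversion.
    """
    l = len(y)
    M = np.zeros((l + 1, l + 1))
    M[:l, :l] = K + np.eye(l) / reg
    M[:l, l] = 1.0
    M[l, :l] = 1.0
    Minv = np.linalg.inv(M)
    sol = Minv @ np.append(y, 0.0)
    alpha, b = sol[:l], sol[l]
    loo_residuals = alpha / np.diag(Minv)[:l]
    return alpha, b, loo_residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)  # +/-1 class targets

    K = rbf_kernel(X, X, gamma=0.5)
    alpha, b, loo = lssvm_loo_residuals(K, y, reg=10.0)

    # Brute-force check: retrain l times with one sample held out (O(l^4)).
    brute = np.empty(len(y))
    for i in range(len(y)):
        keep = np.delete(np.arange(len(y)), i)
        a_i, b_i, _ = lssvm_loo_residuals(K[np.ix_(keep, keep)], y[keep], reg=10.0)
        brute[i] = y[i] - (K[i, keep] @ a_i + b_i)

    print("max |closed-form - brute-force| residual gap:", np.abs(loo - brute).max())
    print("leave-one-out error rate:", np.mean(np.sign(y - loo) != y))
```

The leave-one-out error rate printed at the end is the quantity one would minimise during model selection (over the kernel width and regularisation parameter), which is what the O(l^3) result makes practical at larger scale.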
