IEEE Transactions on Neural Networks

A Kernel-Induced Space Selection Approach to Model Selection in KLDA



Abstract

Model selection in kernel linear discriminant analysis (KLDA) refers to the selection of appropriate parameters of a kernel function and the regularizer. By following the principle of maximum information preservation, this paper formulates the model selection problem as a problem of selecting an optimal kernel-induced space in which different classes are maximally separated from each other. A scatter-matrix-based criterion is developed to measure the “goodness” of a kernel-induced space, and the kernel parameters are tuned by maximizing this criterion. This criterion is computationally efficient and is differentiable with respect to the kernel parameters. Compared with the leave-one-out (LOO) or $k$-fold cross validation (CV), the proposed approach can achieve a faster model selection, especially when the number of training samples is large or when many kernel parameters need to be tuned. To tune the regularization parameter in the KLDA, our criterion is used together with the method proposed by Saadi (2004). Experiments on benchmark data sets verify the effectiveness of this model selection approach.
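The scatter-matrix-based criterion described in the abstract can be evaluated directly from the kernel matrix, without constructing the feature map explicitly. The sketch below is an illustration, not the paper's exact criterion: it measures class separability in a kernel-induced space as the trace ratio trace(S_b)/trace(S_w) of between-class to within-class scatter, and selects an RBF kernel width by grid search on toy data (the paper exploits differentiability to tune parameters by gradient-based optimization instead).

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Pairwise squared Euclidean distances, then the Gaussian kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def separability(K, y):
    """Trace-ratio class separability of a kernel-induced space.

    Computes trace(S_b) / trace(S_w) from the kernel matrix alone:
    inner products of feature-space class means reduce to block
    averages of K (the kernel trick)."""
    n = len(y)
    tr_St = np.mean(np.diag(K)) - np.mean(K)   # trace of total scatter
    m_norm2 = np.mean(K)                       # ||overall mean||^2
    tr_Sb = 0.0
    for c in np.unique(y):
        idx = (y == c)
        nc = idx.sum()
        mc_norm2 = K[np.ix_(idx, idx)].mean()  # ||class-c mean||^2
        mc_dot_m = K[idx, :].mean()            # <class-c mean, overall mean>
        tr_Sb += (nc / n) * (mc_norm2 - 2 * mc_dot_m + m_norm2)
    tr_Sw = tr_St - tr_Sb                      # trace(S_t) = trace(S_b) + trace(S_w)
    return tr_Sb / tr_Sw

# Toy two-class data (hypothetical example, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

# Grid search over the RBF width; a larger criterion value indicates a
# kernel-induced space in which the classes are better separated.
sigmas = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]
scores = [separability(rbf_kernel(X, s), y) for s in sigmas]
best_sigma = sigmas[int(np.argmax(scores))]
```

Because the criterion needs only one pass over the kernel matrix per candidate parameter, it avoids the repeated retraining that leave-one-out or k-fold cross validation would require, which is the source of the speedup claimed in the abstract.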
