The Visual Computer

Efficient kernel discriminative common vectors for classification


Abstract

Kernel discriminant analysis (KDA), which operates in the reproducing kernel Hilbert space (RKHS), is a very popular approach to dimensionality reduction. Kernel discriminative common vectors (KDCV) shares the same modified Fisher linear discriminant criterion as KDA and guarantees a 100% recognition rate on the training set as well as favorable generalization performance. However, KDCV suffers from high computational complexity in both the training and the testing stages. This paper improves the computational efficiency of KDCV via two strategies. First, the Cholesky decomposition is used instead of eigen-decomposition to obtain the projection matrix. Second, matrix operations in the testing process are replaced with vector operations, reducing the computational complexity. Extensive experiments on the COIL image dataset, the ORL face dataset, the PIE face dataset, and the USPS handwritten digit dataset demonstrate that the proposed algorithm is more efficient than the traditional KDCV algorithm without loss of accuracy.
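The first strategy rests on a standard linear-algebra fact: for a symmetric positive-definite matrix (such as a regularized kernel Gram matrix), a Cholesky factorization followed by two triangular solves yields the same result as an eigen-decomposition but at roughly a third of the cost. The toy sketch below illustrates only this general idea with numpy; the matrix `K` and names used here are illustrative stand-ins, not the paper's actual KDCV algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SPD matrix standing in for a regularized kernel Gram matrix.
X = rng.standard_normal((50, 10))
K = X.T @ X + 1e-3 * np.eye(10)   # symmetric positive-definite by construction
b = rng.standard_normal(10)

# Cheaper route: Cholesky factorization K = L L^T,
# then two triangular solves for K x = b.
L = np.linalg.cholesky(K)
y = np.linalg.solve(L, b)         # forward substitution: L y = b
x_chol = np.linalg.solve(L.T, y)  # back substitution:    L^T x = y

# Baseline route: eigen-decomposition K = V diag(w) V^T.
w, V = np.linalg.eigh(K)
x_eig = V @ ((V.T @ b) / w)

# Both routes solve the same system.
assert np.allclose(x_chol, x_eig)
```

The second strategy is analogous in spirit: at test time a single sample needs only a matrix-vector product with the projection, rather than rebuilding full matrix products per query.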
