International Conference on Pattern Recognition

Improved Robust Discriminant Analysis for Feature Extraction

Abstract

Dimensionality reduction (DR) has emerged as a crucial issue in developing effective pattern recognition approaches. However, the performance of many DR algorithms degrades under the impact of noisy environments. To address this problem, we propose in this paper a novel algorithm, termed robust large margin discriminant analysis (RLMDA), to enhance the robustness of extracted features. In the spirit of the large margin principle applied in support vector machines, RLMDA maximizes the minimum between-class dispersion while simultaneously minimizing the within-class dispersion in the reduced subspace. Moreover, the l1-norm, rather than the traditional squared l2-norm, is exploited to describe these dispersions, making the resulting algorithm robust to noisy features. The solution of RLMDA boils down to a nonconvex and nonsmooth optimization problem. Therefore, we take advantage of both the constrained concave-convex procedure (CCCP) and the Lagrangian dual method to develop an efficient iterative algorithm. Experimental results show that RLMDA achieves better performance than other related methods.
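As a rough illustration of the criterion described above (a sketch, not the authors' exact formulation, which the abstract does not give), the RLMDA objective can be read as an l1-norm analogue of a max-min large-margin discriminant criterion. Writing $W$ for the projection matrix, $\mu_c$ for the mean of class $c$, $x_i^c$ for the $i$-th sample of class $c$, and $\lambda$ for a hypothetical trade-off parameter, with an assumed orthogonality constraint on $W$, one plausible form is

$$\max_{W^\top W = I} \;\; \min_{p \neq q} \big\| W^\top (\mu_p - \mu_q) \big\|_1 \;-\; \lambda \sum_{c} \sum_{i} \big\| W^\top (x_i^c - \mu_c) \big\|_1 .$$

The inner minimum over class pairs captures the "minimum between-class dispersion" being maximized, the second term penalizes within-class dispersion, and the l1-norm replaces the usual squared l2-norm so that deviations caused by noisy features contribute linearly rather than quadratically. The max-min structure makes the problem nonconvex and nonsmooth, which is why the abstract resorts to CCCP combined with a Lagrangian dual method.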
