Neural Processing Letters

Regularized Negative Label Relaxation Least Squares Regression for Face Recognition


Abstract

Least squares regression (LSR) is widely used for pattern classification. Several variants attempt to enlarge the margin between different classes to achieve better performance. However, large-margin classifiers do not perform well in complex real-world applications such as face recognition, where images are captured under varying facial expressions, lighting conditions, or backgrounds. To address this problem, we propose a regularized negative label relaxation least squares regression method with the following characteristics. First, we introduce a negative ε-dragging technique to relax the strict binary label matrix into a slack label matrix, which has more freedom to fit the labels and reduces the class margins at the same time. Second, we introduce manifold learning and a class compactness graph to devise a regularization term that preserves the intrinsic structure of the data and avoids overfitting. The class compactness graph keeps samples from the same class close together after they are transformed into the slack label space. An algorithm based on the L2-norm loss function is devised. The experimental results show that our algorithm achieves better classification accuracy.
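The abstract describes an alternating scheme: fit a regression matrix against a relaxed label matrix, then update the negative dragging amounts, with a class-compactness graph term as the regularizer. Below is a minimal Python sketch of that idea under stated assumptions; the dragging direction, the unit-weight class-compactness graph, the closed-form ridge-style update, and all names (nlr_lsr_fit, lam, beta, n_iter) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def class_compactness_laplacian(y):
    """Graph Laplacian of a class-compactness graph: samples of the same
    class are connected with unit weight (a simplifying assumption)."""
    W = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    return D - W

def nlr_lsr_fit(X, y, lam=0.1, beta=0.1, n_iter=20):
    """Hypothetical sketch of negative-label-relaxation least squares regression.

    X : (n, d) data matrix, y : (n,) integer class labels.
    Alternates between solving a regularized least squares problem against a
    slack label matrix T = Y + B * M and updating the non-negative dragging
    matrix M; B flips the dragging direction so margins shrink rather than grow."""
    n, d = X.shape
    classes = np.unique(y)
    c = len(classes)
    Y = np.zeros((n, c))
    Y[np.arange(n), np.searchsorted(classes, y)] = 1.0
    B = np.where(Y == 1.0, -1.0, 1.0)   # negative dragging: pull targets inward
    M = np.zeros((n, c))                # non-negative dragging amounts
    L = class_compactness_laplacian(y)
    Xb = np.hstack([X, np.ones((n, 1))])   # absorb the bias into the weights
    for _ in range(n_iter):
        T = Y + B * M                   # relaxed (slack) label matrix
        # ridge-style solve with the graph term acting on the projections
        A = Xb.T @ Xb + lam * np.eye(d + 1) + beta * Xb.T @ L @ Xb
        W = np.linalg.solve(A, Xb.T @ T)
        # update dragging amounts from the residuals, clipped at zero
        R = Xb @ W - Y
        M = np.maximum(B * R, 0.0)
    return W, classes

def nlr_lsr_predict(W, classes, X):
    """Assign each sample to the class with the largest regression response."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return classes[np.argmax(Xb @ W, axis=1)]
```

The closed-form solve and the clipped residual update mirror the ε-dragging literature; here the signs in B are reversed so that the relaxed targets move toward each other, which is one plausible reading of "negative" label relaxation.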
