Information Sciences: An International Journal

Learning distance to subspace for the nearest subspace methods in high-dimensional data classification


Abstract

The nearest subspace methods (NSM) are a category of classification methods widely applied to high-dimensional data. In this paper, we propose to improve the classification performance of NSM by learning tailored distance metrics from samples to class subspaces. The learned distance metric is termed the 'learned distance to subspace' (LD2S). Using LD2S in the classification rule of NSM moves samples closer to their correct class subspaces and farther away from the wrong class subspaces. In this way, the classification task becomes easier and the classification performance of NSM improves. The superior classification performance of using LD2S with NSM is demonstrated on three real-world high-dimensional spectral datasets. (C) 2018 Elsevier Inc. All rights reserved.
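To make the classification rule concrete, the sketch below shows a baseline nearest subspace classifier: each class is represented by an orthonormal basis fitted from its training samples, and a test sample is assigned to the class whose subspace yields the smallest residual distance. The function names, the `n_components` parameter, and the optional matrix `M` are illustrative assumptions; `M` merely stands in for a learned sample-to-subspace metric in the spirit of LD2S, whose actual learning objective is not described in the abstract.

```python
import numpy as np

def fit_class_subspaces(X, y, n_components=5):
    """Fit an orthonormal basis (via SVD) for each class subspace.

    X: (n_samples, n_features) training data; y: class labels.
    Returns a dict mapping class label -> (n_features, n_components) basis.
    """
    bases = {}
    for c in np.unique(y):
        Xc = X[y == c]                               # samples of class c
        # Left singular vectors of Xc^T span the class subspace.
        U, _, _ = np.linalg.svd(Xc.T, full_matrices=False)
        bases[c] = U[:, :n_components]
    return bases

def distance_to_subspace(x, U, M=None):
    """Distance from sample x to the subspace spanned by basis U.

    With M=None this is the standard NSM residual ||x - U U^T x||.
    M (positive semidefinite, n_features x n_features) is a placeholder
    for a learned metric such as LD2S, not the paper's formulation.
    """
    residual = x - U @ (U.T @ x)                     # projection residual
    if M is None:
        return np.linalg.norm(residual)
    return np.sqrt(residual @ M @ residual)          # Mahalanobis-style distance

def nsm_predict(x, bases, M=None):
    """Assign x to the class whose subspace is nearest under the metric."""
    return min(bases, key=lambda c: distance_to_subspace(x, bases[c], M))

# Usage on synthetic high-dimensional data.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 100))
y = np.repeat([0, 1, 2], 20)
bases = fit_class_subspaces(X, y, n_components=5)
print(nsm_predict(X[0], bases))                      # baseline NSM prediction
```

Under the paper's approach, replacing the plain residual norm with a metric learned so that samples sit closer to their own class subspace than to others would leave this decision rule unchanged while reshaping the distances it compares.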
