Robust hierarchical feature selection with a capped ℓ_2-norm


Abstract

Feature selection methods face new challenges in large-scale classification tasks because massive categories are managed in a hierarchical structure. Hierarchical feature selection can take full advantage of the dependencies among hierarchically structured classes. However, most existing hierarchical feature selection methods are not robust to the inevitable data outliers, which causes a serious inter-level error propagation problem in the subsequent classification process. In this paper, we propose a robust hierarchical feature selection method with a capped ℓ_2-norm (HFSCN), which can reduce the adverse effects of data outliers and learn relatively robust and discriminative feature subsets for the hierarchical classification process. Firstly, the large-scale global classification task is split into several small local sub-classification tasks according to the hierarchical class structure and a divide-and-conquer strategy, which simplifies feature selection modeling. Secondly, a capped ℓ_2-norm based loss function is used in the feature selection process of each local sub-classification task to eliminate data outliers, which alleviates their negative effects and improves the robustness of the learned feature weight matrix. Finally, an inter-level relation constraint based on the similarity between parent and child classes is added to the feature selection model, which enhances the discriminative ability of the feature subset selected for each sub-classification task with the learned robust feature weight matrix. Compared with seven traditional and state-of-the-art hierarchical feature selection methods, the superior performance of HFSCN is verified on 16 real and synthetic datasets. (C) 2021 Elsevier B.V. All rights reserved.
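The capped ℓ_2-norm loss mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general capping technique, not the paper's full optimization model; the function name, the residual inputs, and the cap value `eps` are illustrative assumptions.

```python
import numpy as np

def capped_l2_loss(residuals, eps=1.0):
    """Capped l2-norm loss over per-sample residual vectors.

    Each sample contributes min(||r_i||_2, eps), so a sample whose
    residual norm exceeds eps (a likely outlier) is capped and cannot
    dominate the total loss.
    """
    norms = np.linalg.norm(residuals, axis=1)  # per-sample l2 norms
    return np.minimum(norms, eps)

# An inlier with a small residual keeps its true loss; an outlier
# with a large residual is capped at eps.
losses = capped_l2_loss(np.array([[0.3, 0.4],    # norm 0.5 -> kept
                                  [3.0, 4.0]]),  # norm 5.0 -> capped
                        eps=1.0)
```

Because each sample's contribution is bounded by `eps`, a few extreme outliers cannot dominate the gradient of the objective, which is the robustness property the paper exploits when learning the feature weight matrix.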

Bibliographic Information

  • Source
    Neurocomputing | 2021, Issue 5 | pp. 131-146 | 16 pages
  • Authors

    Liu Xinxin; Zhao Hong;

  • Author Affiliations

    Minnan Normal Univ, Sch Comp Sci, Zhangzhou 363000, Fujian, Peoples R China | Fujian Prov Univ, Key Lab Data Sci & Intelligence Applicat, Zhangzhou 363000, Fujian, Peoples R China | Minnan Normal Univ, Fujian Key Lab Granular Comp & Applicat, Zhangzhou 363000, Fujian, Peoples R China;

    Minnan Normal Univ, Sch Comp Sci, Zhangzhou 363000, Fujian, Peoples R China | Fujian Prov Univ, Key Lab Data Sci & Intelligence Applicat, Zhangzhou 363000, Fujian, Peoples R China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: English
  • Keywords

    Inter-level error propagation; Capped ℓ_2-norm; Data outliers; Feature selection; Hierarchical classification;


