
A novel classifier ensemble method with sparsity and diversity



Abstract

We consider the classifier ensemble problem in this paper. Owing to its superior performance over individual classifiers, the classifier ensemble has been intensively studied in the literature. Generally speaking, there are two prevalent research directions: diversely generating classifier components, and sparsely combining multiple classifiers. While most current approaches emphasize either sparsity or diversity alone, we investigate the classifier ensemble by learning both sparsity and diversity simultaneously. We formulate the classifier ensemble problem with sparsity and/or diversity learning in a general framework. In particular, the classifier ensemble with sparsity and diversity can be represented as a mathematical optimization problem. We then propose a heuristic algorithm capable of obtaining ensemble classifiers that account for both sparsity and diversity. We exploit the genetic algorithm, and optimize sparsity and diversity for classifier selection and combination heuristically and iteratively. As one major contribution, we introduce the concept of diversity contribution ability in order to select proper classifier components and eventually evolve the classifier weights. Finally, we extensively compare our proposed method with conventional classifier ensemble methods such as Bagging, least squares combination, sparsity learning, and AdaBoost on UCI benchmark data sets and the Pascal Large Scale Learning Challenge 2008 webspam data. The experimental results confirm that our approach leads to better performance in many respects.
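The abstract describes the approach only at a high level. As a rough illustration of jointly pursuing sparsity and diversity, the minimal sketch below selects a small, mutually disagreeing subset of base classifiers with a simple genetic algorithm. The fitness function (validation accuracy plus a pairwise-disagreement reward minus a penalty on ensemble size), the GA operators, and all parameter values are illustrative assumptions, not the authors' formulation, which evolves classifier weights via the diversity contribution ability rather than a binary selection mask.

    # Illustrative sketch only: GA-based selection of a sparse, diverse classifier subset.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

    # Pool of base classifiers trained on bootstrap samples (a Bagging-style pool).
    pool = []
    for _ in range(20):
        idx = rng.integers(0, len(X_tr), len(X_tr))
        pool.append(DecisionTreeClassifier(max_depth=3).fit(X_tr[idx], y_tr[idx]))
    preds = np.array([clf.predict(X_val) for clf in pool])  # shape: (n_classifiers, n_val)

    def fitness(mask, lam=0.02, gamma=0.05):
        """Majority-vote validation accuracy + diversity reward (mean pairwise
        disagreement) - sparsity penalty on the number of selected classifiers."""
        chosen = np.flatnonzero(mask)
        if chosen.size == 0:
            return -np.inf
        vote = (preds[chosen].mean(axis=0) >= 0.5).astype(int)
        acc = (vote == y_val).mean()
        if chosen.size > 1:
            sub = preds[chosen]
            disagree = np.mean([(sub[i] != sub[j]).mean()
                                for i in range(len(sub))
                                for j in range(i + 1, len(sub))])
        else:
            disagree = 0.0
        return acc + gamma * disagree - lam * chosen.size

    # Simple GA: binary chromosomes over the pool, tournament selection,
    # uniform crossover, bit-flip mutation, with elitism.
    pop = rng.integers(0, 2, size=(30, len(pool)))
    for _ in range(40):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = [pop[scores.argmax()].copy()]
        while len(new_pop) < len(pop):
            a, b = rng.integers(0, len(pop), 2)
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.integers(0, len(pop), 2)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            cross = rng.integers(0, 2, len(pool)).astype(bool)
            child = np.where(cross, p1, p2)
            flip = rng.random(len(pool)) < 0.05
            new_pop.append(np.where(flip, 1 - child, child))
        pop = np.array(new_pop)

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("selected classifiers:", np.flatnonzero(best))
    print("ensemble fitness:", fitness(best))

In this toy version, the size penalty stands in for sparse combination and the disagreement reward stands in for diversity; the trade-off coefficients lam and gamma are arbitrary placeholders, not values reported in the paper.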

Bibliographic information

  • Source
    Neurocomputing | 2014, Issue 25 | pp. 214-221 | 8 pages
  • Author affiliations

    Department of Computer Science and Technology, School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China;

    Department of Electrical and Electronic Engineering, Xi'an Jiaotong-Liverpool University, Suzhou 215123, China;

    Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China;

    Department of Computer Science and Technology, School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China;

    China National Engineering Research Center for Information Technology in Agriculture, Beijing 100097, China;

  • Indexing: Science Citation Index (SCI, USA); Engineering Index (EI, USA)
  • Original format: PDF
  • Language: English (eng)
  • Chinese Library Classification (CLC)
  • Keywords

    Classifier ensemble; Sparsity learning; Diversity learning; Neural network ensembles; Genetic algorithm;

