The Sixteenth IEEE International Conference on Computational Science and Engineering

Subset Selection Classifier (SSC): A Training Set Reduction Method


Abstract

Instance-based learning algorithms are often required to choose which instances to store for use during classification. Keeping too many instances usually increases storage and processing-time requirements during classification. Many attempts have been made to reduce the size of the training set, but the major drawback of the majority of these attempts is their expensive learning process, which limits their application in practical domains. In this paper, we propose a new training set reduction algorithm called the Subset Selection Classifier (SSC), which chooses a minimal subset by performing an incremental search over the training set. SSC extends the nearest-neighbor concept by constructing several circular regions in the training sample and building a model by collecting the central instance of each circular region along with its radius. A test instance is classified by the selected instances if it falls within the radius of any selected instance. Experimental evaluation against 12 existing techniques on 11 benchmark datasets shows that, in the average case, SSC achieves both the best accuracy and the greatest reduction in training set size.
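The abstract describes the SSC model only at a high level: circular regions with stored centers and radii, chosen by an incremental search, with test instances labeled by the region that contains them. The sketch below is one plausible reading of that description, not the paper's exact procedure. The class-pure radius rule (each radius stops at the nearest opposite-class instance), the greedy coverage order, and the nearest-center fallback for test points outside every region are all assumptions of this sketch, and the function names are hypothetical.

```python
import numpy as np

def fit_ssc_sketch(X, y):
    """Greedy, SSC-style reduction sketch (illustrative; not the paper's exact procedure).

    Each candidate center i is given a radius equal to its distance to the nearest
    training instance of another class (an assumption of this sketch). Centers are
    chosen greedily by how many still-uncovered same-class instances they cover,
    until every training instance lies inside some selected region.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)    # pairwise distances
    radius = np.array([dist[i, y != y[i]].min() for i in range(n)])  # largest class-pure radius

    covered = np.zeros(n, dtype=bool)
    model = []                                                       # (center, radius, label)
    while not covered.all():
        # How many uncovered same-class instances would each candidate cover?
        gain = [np.sum((dist[i] <= radius[i]) & (y == y[i]) & ~covered) for i in range(n)]
        best = int(np.argmax(gain))
        model.append((X[best], radius[best], y[best]))
        covered |= (dist[best] <= radius[best]) & (y == y[best])
        covered[best] = True
    return model

def predict_ssc_sketch(model, x):
    """Label x by the nearest region whose radius contains it; fall back to the
    nearest stored center when no region covers x (a case the abstract does not specify)."""
    x = np.asarray(x, dtype=float)
    d = [np.linalg.norm(x - c) for c, _, _ in model]
    inside = [i for i, (c, r, _) in enumerate(model) if d[i] <= r]
    pick = min(inside, key=lambda i: d[i]) if inside else int(np.argmin(d))
    return model[pick][2]

# Example: reduce a tiny 2-D training set and classify one point.
X_train = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]]
y_train = [0, 0, 1, 1]
model = fit_ssc_sketch(X_train, y_train)
print(len(model), predict_ssc_sketch(model, [0.1, 0.0]))  # 2 regions, predicts class 0
```

Stopping each radius at the nearest opposite-class instance keeps every stored region class-pure, which is one simple way to realize the "central instance plus radius" model the abstract describes while still shrinking the stored set.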
