
A Novel Bayesian Ensemble Pruning Method



Abstract

In ensemble learning, ensemble pruning is a procedure that aims at removing unnecessary base classifiers and retaining the best subset of the base classifiers. We presented a two-step ensemble pruning framework in which the optimal size of the pruned ensemble is decided first, and then, with that optimal size as input, the optimal ensemble is selected. For the first step, we presented an algorithm that provably finds the Bayesian optimal ensemble size. For the second step, we developed two greedy forward pruning methods, namely the Bayesian Pruning (BP) method and the Bayesian Independent Pruning (BIP) method. In the BP method, we assumed that the probability that a candidate ensemble is the optimal ensemble follows a Generalized Beta distribution. In the BIP method, we further assumed that whether a base classifier belongs to the optimal ensemble is independent of the other base classifiers. Experimental results on twenty benchmark data sets showed that the BP and BIP methods achieved competitive performance compared with other state-of-the-art algorithms.
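The abstract only describes the framework at a high level. As a rough illustration of the second step's greedy forward selection, the following minimal Python sketch grows a pruned ensemble one base classifier at a time, using plain majority-vote accuracy on a validation set as the selection criterion. It does not implement the paper's Generalized Beta (BP) or independence-based (BIP) Bayesian scores, and the function and parameter names (greedy_forward_prune, base_preds, target_size) are hypothetical.

```python
import numpy as np

def greedy_forward_prune(base_preds, y_val, target_size):
    """Generic greedy forward ensemble pruning (illustrative sketch).

    base_preds: int array of shape (n_classifiers, n_val_samples) holding each
        base classifier's predicted labels (non-negative integers) on a validation set.
    y_val: true validation labels, shape (n_val_samples,).
    target_size: desired size of the pruned ensemble, e.g. the optimal size
        found in step one of a two-step framework.
    Returns the indices of the selected base classifiers.
    """
    n_classifiers = base_preds.shape[0]
    selected = []                      # indices of chosen base classifiers
    remaining = set(range(n_classifiers))

    while len(selected) < target_size and remaining:
        best_idx, best_acc = None, -1.0
        for idx in remaining:
            candidate = selected + [idx]
            votes = base_preds[candidate]
            # Majority vote of the candidate sub-ensemble for each validation sample.
            maj = np.array([np.bincount(votes[:, j]).argmax()
                            for j in range(votes.shape[1])])
            acc = float(np.mean(maj == y_val))
            if acc > best_acc:
                best_idx, best_acc = idx, acc
        # Add the base classifier that most improves validation accuracy.
        selected.append(best_idx)
        remaining.remove(best_idx)

    return selected
```

A Bayesian variant in the spirit of the paper would replace the accuracy criterion above with a posterior score over candidate ensembles; the greedy forward loop structure would stay the same.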

