
BAYESIAN PREFERENCE LEARNING FOR MULTICLASS PROBLEMS


Abstract

There are methods that reduce a multiclass classification problem to a set of two-class classification problems. Instead of learning a direct classifier X → Y (where X is the input space and Y is the set of classes), such methods learn an indirect relation such as X × Y² → {0, 1}. The technique is sometimes used because of its advantages: an expected improvement in accuracy and the availability of two-class learners. In this paper, we propose a reduction method based on the idea of estimating learning-error probabilities. We derive an equation from Bayes' theorem. We performed experiments on 18 UCI datasets. On seven of the datasets the proposed classifier was superior to C4.5, and it was inferior on none.
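The pairwise reduction described in the abstract can be illustrated with a short program. The sketch below, written against scikit-learn, trains one two-class decision tree (a stand-in for C4.5) for every unordered pair of classes and combines their outputs by simple majority voting. The dataset, base learner, and function names are illustrative assumptions, and the voting rule is not the paper's Bayes-theorem combination of estimated learning-error probabilities.

# A minimal sketch of reducing a multiclass problem to pairwise two-class
# problems (the X × Y² → {0, 1} relation above).  Majority voting is used here
# instead of the paper's Bayesian combination; all names are illustrative.

from itertools import combinations
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def train_pairwise(X, y):
    """Train one two-class classifier per unordered pair of classes."""
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        clf = DecisionTreeClassifier(random_state=0)
        clf.fit(X[mask], y[mask])
        models[(a, b)] = clf
    return models


def predict_pairwise(models, X, classes):
    """Combine pairwise decisions by majority voting over class pairs."""
    votes = np.zeros((len(X), len(classes)), dtype=int)
    index = {c: i for i, c in enumerate(classes)}
    for (a, b), clf in models.items():
        pred = clf.predict(X)  # each prediction is either class a or class b
        for row, winner in enumerate(pred):
            votes[row, index[winner]] += 1
    return classes[np.argmax(votes, axis=1)]


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    classes = np.unique(y_tr)
    models = train_pairwise(X_tr, y_tr)
    y_hat = predict_pairwise(models, X_te, classes)
    print("pairwise-reduction accuracy:", np.mean(y_hat == y_te))

Replacing the vote counting with a combination weighted by each pairwise learner's estimated error probability would move this sketch closer to the Bayesian combination the abstract refers to, but that derivation is not reproduced here.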
