Annual Conference on Computational Learning Theory

Bayes and Tukey Meet at the Center Point



Abstract

The Bayes classifier achieves the minimal error rate by constructing a weighted majority over all concepts in the concept class. The Bayes Point [1] instead uses the single concept in the class which has the minimal error. In this way, the Bayes Point avoids some of the deficiencies of the Bayes classifier. We prove a bound on the generalization error of Bayes Point Machines when learning linear classifiers, and show that it is at most ~1.71 times the generalization error of the Bayes classifier, independent of the input dimension and the length of training. We show that when learning linear classifiers, the Bayes Point is almost identical to the Tukey Median [2] and the Center Point [3]. We extend these definitions beyond linear classifiers and define the Bayes Depth of a classifier. We prove a generalization bound in terms of this new definition. Finally, we provide a new concentration-of-measure inequality for multivariate random variables, applied to the Tukey Median.
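For readers unfamiliar with the Tukey Median referenced above, the following is a minimal illustrative sketch (not the paper's method): the Tukey (halfspace) depth of a point is the smallest fraction of sample points contained in any closed halfspace through that point, and the Tukey Median is a point of maximal depth. The brute-force direction-sampling approximation below, restricted to 2D for simplicity, is our own illustration; all function names are hypothetical.

```python
import numpy as np

def tukey_depth(point, data, n_dirs=360):
    # Approximate Tukey (halfspace) depth in 2D: for each sampled
    # direction u, count the sample points in the closed halfspace
    # {x : u . (x - point) >= 0}; the depth is the minimum such
    # count over all directions, as a fraction of the sample size.
    angles = np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # (n_dirs, 2)
    proj = (data - point) @ dirs.T                             # (n, n_dirs)
    counts = (proj >= 0).sum(axis=0)                           # one count per direction
    return counts.min() / len(data)

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 2))

# Approximate Tukey Median: the sample point of maximal depth.
depths = np.array([tukey_depth(p, data) for p in data])
median = data[int(np.argmax(depths))]
```

A central point of a symmetric cloud has depth close to 1/2, while an extreme point has depth close to 1/n; the Center Point theorem guarantees a point of depth at least 1/(d+1) always exists, which is the connection to the Center Point [3] that the abstract exploits.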
