
Bayes and Tukey Meet at the Center Point



Abstract

The Bayes classifier achieves the minimal error rate by constructing a weighted majority vote over all concepts in the concept class. The Bayes Point instead uses the single concept in the class with minimal error, and in this way avoids some of the deficiencies of the Bayes classifier. We prove a bound on the generalization error of Bayes Point Machines when learning linear classifiers, and show that it is at most ~1.71 times the generalization error of the Bayes classifier, independent of the input dimension and the length of training. We show that when learning linear classifiers, the Bayes Point is almost identical to the Tukey Median and the Center Point. We extend these definitions beyond linear classifiers and define the Bayes Depth of a classifier, and we prove a generalization bound in terms of this new definition. Finally, we provide a new concentration-of-measure inequality relating multivariate random variables to the Tukey Median.
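To make the Tukey Median concrete: the Tukey (halfspace) depth of a point is the minimum fraction of data points contained in any closed halfspace whose boundary passes through that point, and the Tukey Median is a point of maximal depth. The sketch below is not from the paper; it is an illustrative Monte-Carlo approximation of halfspace depth in 2D, where the function name `tukey_depth` and the direction-sampling scheme are our own choices.

```python
import numpy as np

def tukey_depth(point, data, n_directions=1000, seed=0):
    """Approximate the Tukey (halfspace) depth of `point` w.r.t. `data`:
    the minimum, over sampled directions, of the fraction of data points
    lying in the closed halfspace on one side of `point`."""
    rng = np.random.default_rng(seed)
    angles = rng.uniform(0.0, 2.0 * np.pi, n_directions)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit vectors
    proj_data = data @ dirs.T    # shape (n_points, n_directions)
    proj_pt = point @ dirs.T     # shape (n_directions,)
    # For each direction, fraction of points at or beyond `point`'s projection.
    fractions = (proj_data >= proj_pt).mean(axis=0)
    return fractions.min()

# A central point of a cloud has high depth; a far-away outlier has depth near 0.
rng = np.random.default_rng(1)
cloud = rng.normal(size=(200, 2))
print(tukey_depth(cloud.mean(axis=0), cloud))   # close to 1/2
print(tukey_depth(np.array([10.0, 10.0]), cloud))  # close to 0
```

Maximizing this depth over candidate points yields an approximate Tukey Median; depth can never exceed 1/2 for points in general position, which is why deep points are "central" in the sense the abstract exploits.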
