Conference: International Conference on Computational Statistics

Data Dependent Priors in PAC-Bayes Bounds

Abstract

One of the central aims of Statistical Learning Theory is the bounding of the test set performance of classifiers trained with i.i.d. data. For Support Vector Machines the tightest technique for assessing this so-called generalisation error is known as the PAC-Bayes theorem. The bound holds independently of the choice of prior, but better priors lead to sharper bounds. The priors leading to the tightest bounds to date are spherical Gaussian distributions whose means are determined from a separate subset of data. This paper gives another turn of the screw by introducing a further data dependence on the shape of the prior: the separate data set determines a direction along which the covariance matrix of the prior is stretched in order to sharpen the bound. In addition, we present a classification algorithm that aims at minimizing the bound as a design criterion and whose generalisation can be easily analysed in terms of the new bound. The experimental work includes a set of classification tasks preceded by a bound-driven model selection. These experiments illustrate how the new bound acting on the new classifier can be much tighter than the original PAC-Bayes Bound applied to an SVM, and lead to more accurate classifiers.
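The quantities in the abstract can be made concrete with a small numerical sketch of the Langford–Seeger form of the PAC-Bayes theorem, which upper-bounds the true error of a Gibbs classifier by inverting a binary KL divergence. The function names below are illustrative, not from the paper; the paper's direction-stretched prior is stood in for here by a diagonal-covariance Gaussian, which shows how stretching the prior's covariance changes the KL term and hence the bound.

```python
import numpy as np

def binary_kl(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p)."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))

def kl_inverse(emp_err, rhs):
    """Largest p with binary_kl(emp_err, p) <= rhs, found by bisection.

    This is the inversion step that turns the PAC-Bayes inequality
    into an explicit upper bound on the true error."""
    lo, hi = emp_err, 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if binary_kl(emp_err, mid) > rhs:
            hi = mid
        else:
            lo = mid
    return lo

def gaussian_kl(mu_q, mu_p, var_p):
    """KL( N(mu_q, I) || N(mu_p, diag(var_p)) ).

    With var_p = 1 everywhere this is the spherical-prior KL,
    ||mu_q - mu_p||^2 / 2; unequal entries of var_p mimic a prior
    stretched along chosen directions."""
    d = len(mu_q)
    diff = np.asarray(mu_q) - np.asarray(mu_p)
    var_p = np.asarray(var_p)
    return 0.5 * (np.sum(1.0 / var_p) + np.sum(diff**2 / var_p)
                  - d + np.sum(np.log(var_p)))

def pac_bayes_bound(emp_err, kl_div, m, delta):
    """Langford-Seeger PAC-Bayes upper bound on the true error,
    holding with probability at least 1 - delta over m i.i.d. samples."""
    rhs = (kl_div + np.log((m + 1) / delta)) / m
    return kl_inverse(emp_err, rhs)
```

For example, a posterior mean close to a well-chosen prior mean yields a small `gaussian_kl`, and `pac_bayes_bound` then returns a tighter upper bound; a prior whose mean (and, in the paper's extension, covariance shape) is fitted on a separate data subset is exactly a device for making that KL term small.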
