Conference: Annual Conference on Neural Information Processing Systems

On statistical learning via the lens of compression



Abstract

This work continues the study of the relationship between sample compression schemes and statistical learning, which has so far been investigated mostly within the framework of binary classification. The central theme of this work is establishing equivalences between learnability and compressibility, and exploiting these equivalences in the study of statistical learning theory. We begin with the setting of multiclass categorization (zero/one loss). We prove that in this case learnability is equivalent to compression of logarithmic sample size, and that uniform convergence implies compression of constant size. We then consider Vapnik's general learning setting: we show that in order to extend the compressibility-learnability equivalence to this case, it is necessary to consider an approximate variant of compression. Finally, we provide some applications of the compressibility-learnability equivalences.
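To make the central object concrete: a sample compression scheme consists of a compression map, which selects a small subsample of the training data, and a reconstruction map, which rebuilds a hypothesis from that subsample alone. The following minimal sketch (my own illustration, not taken from the paper) shows a scheme of size 1 for one-dimensional threshold classifiers under the zero/one loss.

```python
# Illustrative sample compression scheme of size 1 for 1-D thresholds
# h_t(x) = 1 if x >= t else 0, under the zero/one loss.
# (Hypothetical example code; the class and maps are chosen for exposition.)

def compress(sample):
    """Keep only the smallest positively labeled point (or nothing)."""
    positives = [x for x, y in sample if y == 1]
    return [(min(positives), 1)] if positives else []

def reconstruct(compressed):
    """Rebuild a hypothesis from the compressed subsample alone."""
    if not compressed:
        return lambda x: 0  # no positive examples kept: constant-zero rule
    t = compressed[0][0]
    return lambda x: 1 if x >= t else 0

# On any sample realizable by a threshold, the reconstructed hypothesis
# is consistent with every example, yet only one point was retained.
sample = [(0.2, 0), (0.5, 0), (0.7, 1), (0.9, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)
```

Consistency holds because every negative point lies strictly below the true threshold, which in turn is at most the smallest positive point kept by the compression map. The paper's results concern how small such schemes can be made in general: logarithmic in the sample size exactly when the class is learnable, and constant when uniform convergence holds.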


