...
JMLR: Workshop and Conference Proceedings

Open Problem: Information Complexity of VC Learning

Abstract

The uniform convergence approach to learning studies the complexity of hypothesis classes. In particular, hypothesis classes with bounded Vapnik-Chervonenkis (VC) dimension exhibit strong uniform convergence, that is, every hypothesis in the class has a small generalization gap between its empirical and population error. On the other hand, a long line of work studies the information complexity of a learning algorithm, as it is connected to several desirable properties, including generalization. We ask whether all VC classes admit a learner with low information complexity that achieves the generalization bounds guaranteed by uniform convergence. Specifically, since we know this is impossible if we restrict attention to proper and consistent learners and measure information complexity via mutual information (Bassily et al., 2018), we are interested in learners with low information complexity measured via the recently introduced notion of CMI (Steinke and Zakynthinou, 2020). Can we obtain tight bounds on the information complexity of a learning algorithm for a VC class (via CMI), thus exactly retrieving the known generalization bounds implied for this class by uniform convergence?
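For reference, here is a sketch of the quantities involved, following the definitions of Steinke and Zakynthinou (2020); the notation is ours and the exact constants should be checked against the cited works. Draw a supersample $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ of $2n$ i.i.d. examples from the data distribution $\mathcal{D}$, pick a uniform selector $S \sim \mathrm{Uniform}(\{1,2\}^n)$, and run the learning algorithm $A$ on the training set $\tilde{Z}_S = (\tilde{Z}_{i,S_i})_{i=1}^{n}$.

% Notation ours; definitions and bounds as we read them in
% Steinke and Zakynthinou (2020) -- verify against the paper.
\[
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big(A(\tilde{Z}_S);\, S \,\big|\, \tilde{Z}\big) \;\le\; H(S) \;=\; n \log 2,
\]
\[
\Big|\, \mathbb{E}\big[ R_{\mathcal{D}}(A(\tilde{Z}_S)) - \hat{R}_{\tilde{Z}_S}(A(\tilde{Z}_S)) \big] \Big| \;\le\; \sqrt{\frac{2\, \mathrm{CMI}_{\mathcal{D}}(A)}{n}} \qquad \text{(loss bounded in } [0,1]\text{)}.
\]

For a class of VC dimension $d$, uniform convergence guarantees a generalization gap of $O(\sqrt{d/n})$, whereas the learners we are aware of for such classes achieve $\mathrm{CMI}_{\mathcal{D}}(A) = O(d \log n)$, which the inequality above turns into a gap of only $O(\sqrt{d \log n / n})$. The question is whether $\mathrm{CMI}_{\mathcal{D}}(A) = O(d)$ is attainable, which would recover the uniform convergence rate exactly.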
