JMLR: Workshop and Conference Proceedings

Average-Case Information Complexity of Learning


Abstract

How many bits of information are revealed by a learning algorithm for a concept class of VC-dimension $d$? Previous works have shown that even for $d=1$ the amount of information may be unbounded (it can tend to $\infty$ with the universe size). Can it be that all concepts in the class require leaking a large amount of information? We show that typically this is not the case: there exists a proper learning algorithm that reveals $O(d)$ bits of information for most concepts in the class. This result is a special case of a more general phenomenon we explore. If there is a low-information learner when the algorithm \emph{knows} the underlying distribution on inputs, then there is a learner that reveals little information on an average concept \emph{without knowing} the distribution on inputs.
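For context, a minimal sketch of how "bits of information revealed" is commonly formalized in this line of work, assuming the standard mutual-information definition; the symbols $S$, $A$, $\mathcal{C}$, and $D$ below are illustrative notation, not taken from the abstract itself. A (possibly randomized) learner $A$ receives a sample $S = \bigl((x_1, c(x_1)), \ldots, (x_n, c(x_n))\bigr)$ with $x_i \sim D$ i.i.d. and a target concept $c \in \mathcal{C}$, and its information complexity is
$$
\mathrm{IC}_c(A) \;=\; I\bigl(S \,;\, A(S)\bigr),
$$
the mutual information between the sample and the output hypothesis. Under this reading, the abstract's average-case claim says there is a proper learner $A$ with $\mathrm{IC}_c(A) = O(d)$ for most concepts $c$ in a class of VC-dimension $d$, even though a worst-case concept may force $\mathrm{IC}_c(A)$ to grow with the universe size.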
