
DNF Are Teachable in the Average Case



Abstract

We study the average number of well-chosen labeled examples that are required for a helpful teacher to uniquely specify a target function within a concept class. This "average teaching dimension" has been studied in learning theory and combinatorics and is an attractive alternative to the "worst-case" teaching dimension of Goldman and Kearns, which is exponential for many interesting concept classes. Recently Balbach showed that the classes of 1-decision lists and 2-term DNF each have linear average teaching dimension. As our main result, we extend Balbach's teaching result for 2-term DNF by showing that for any 1 ≤ s ≤ 2^{Θ(n)}, the well-studied concept classes of at-most-s-term DNF and at-most-s-term monotone DNF each have average teaching dimension O(ns). The proofs use detailed analyses of the combinatorial structure of "most" DNF formulas and monotone DNF formulas. We also establish asymptotic separations between the worst-case and average teaching dimension for various other interesting Boolean concept classes such as juntas and sparse GF_2 polynomials.
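To make the definitions concrete: the teaching dimension of a concept is the size of a smallest set of labeled examples consistent with that concept and no other concept in the class; the worst-case teaching dimension takes the maximum of this quantity over the class, while the average teaching dimension takes the mean. The brute-force sketch below (not from the paper; the toy concept class of singletons plus the empty concept is an illustrative assumption) computes both for a tiny class, and already exhibits a gap between the two measures.

```python
from itertools import combinations

def teaching_dims(concepts, domain):
    """For each concept (a set of positive points over `domain`), return the
    size of a smallest labeled-example set that is consistent with that
    concept and with no other concept in the class."""
    dims = []
    for c in concepts:
        for k in range(len(domain) + 1):
            done = False
            for sample in combinations(domain, k):
                # Labels the target concept c assigns to the sample points.
                labels = [x in c for x in sample]
                # All concepts in the class consistent with the labeled sample.
                consistent = [d for d in concepts
                              if all((x in d) == lab
                                     for x, lab in zip(sample, labels))]
                if consistent == [c]:   # c is uniquely specified
                    dims.append(k)
                    done = True
                    break
            if done:
                break
    return dims

# Toy class over a 3-point domain: the empty concept plus all singletons.
domain = [0, 1, 2]
concepts = [frozenset()] + [frozenset({x}) for x in domain]
dims = teaching_dims(concepts, domain)
print(max(dims), sum(dims) / len(dims))
```

Each singleton is taught by one positive example, but the empty concept needs all three points labeled negative, so the worst-case teaching dimension (3) already exceeds the average (1.5) on this toy class; the paper's results concern the much larger gap for DNF classes.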


