U.S. Government Technical Report

Metric Entropy, Vapnik-Chervonenkis Dimension, and Learnability for a Class of Distributions.



Abstract

Valiant's formal framework for distribution-free concept learning has generated a great deal of interest. A fundamental result regarding this framework characterizes the learnable concept classes in terms of their Vapnik-Chervonenkis (VC) dimension. More recently, learnability with respect to a fixed distribution was studied, and a conjecture regarding learnability for a class of distributions was stated. In this report, we first point out that the condition for learnability with respect to a fixed distribution is equivalent to the notion of finite metric entropy (which has been studied in other contexts). Some relationships between the VC dimension of a concept class and its metric entropy with respect to various distributions are then discussed. Finally, we give some indication of when the set of learnable concept classes is enlarged by requiring learnability for only a class of distributions.
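The two quantities the abstract relates can be made concrete on toy concept classes. The sketch below (illustrative only; not from the report) checks the VC dimension of intervals on the real line by exhaustive shattering, and counts an epsilon-cover for threshold concepts [0, t] under the uniform distribution on [0, 1], where the distance between two concepts is the probability of their symmetric difference. The function names are our own.

```python
import math
from itertools import product

def intervals_can_realize(points, labels):
    """Can some interval [a, b] produce this 0/1 labeling of the points?

    An interval labels x with 1 iff a <= x <= b, so a labeling is
    realizable iff no 0-labeled point lies between two 1-labeled points.
    """
    ones = [x for x, y in zip(points, labels) if y == 1]
    if not ones:
        return True  # the empty interval realizes the all-zero labeling
    lo, hi = min(ones), max(ones)
    return all(y == 1 for x, y in zip(points, labels) if lo <= x <= hi)

def shattered(points):
    """True iff intervals realize every one of the 2^n labelings."""
    return all(intervals_can_realize(points, labels)
               for labels in product([0, 1], repeat=len(points)))

def threshold_cover_size(eps):
    """Size of an eps-cover of the thresholds {[0, t] : t in [0, 1]}
    under uniform P, where d(t1, t2) = P([0,t1] sym.diff. [0,t2]) = |t1 - t2|.
    Centers eps, 3*eps, 5*eps, ... suffice, giving ceil(1/(2*eps)) balls;
    the metric entropy log of this count is finite for every eps > 0."""
    return math.ceil(1 / (2 * eps))

# VC dimension of intervals is 2: a 2-point set is shattered, and no
# 3-point set can be (the labeling 1,0,1 is never realizable).
print(shattered([0.0, 1.0]))        # True
print(shattered([0.0, 1.0, 2.0]))  # False
print(threshold_cover_size(0.1))   # 5
```

The finite cover for thresholds illustrates the abstract's first point: with respect to a fixed distribution, learnability hinges on finite metric entropy rather than on combinatorial shattering alone.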

