Australian journal of intelligent information processing systems

Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions



Abstract

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension, obtained by discretizing the range of a real function class. We then point out that Sauer's Lemma remains valid for the discretized VC dimension. Using the discretized VC dimension, we group real function classes with infinite VC dimension into four categories. As a byproduct, we present the equidistantly discretized VC dimension by introducing an equidistant partition to segment the range of a real function class. Finally, we obtain error bounds for real function classes based on the discretized VC dimensions within the PAC-learning framework.
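The abstract only names the construction; the following is a minimal LaTeX sketch of one natural reading of it, assuming the range is discretized by equidistant thresholds and each threshold induces an indicator class. The symbols $t_j$, $\mathcal{F}_m$, and the final bound shape are illustrative assumptions, not the paper's exact definitions.

% Illustrative sketch only: the precise definitions are given in the paper, not in the abstract.
% Assumed setup: a real function class \mathcal{F} whose range lies in [a,b].
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
An equidistant partition of $[a,b]$ into $m$ cells uses the thresholds
\[
  t_j = a + j\,\frac{b-a}{m}, \qquad j = 1,\dots,m-1 .
\]
Each $f \in \mathcal{F}$ then induces binary-valued (indicator) functions
\[
  f_{t_j}(x) = \mathbf{1}\{\, f(x) \ge t_j \,\},
\]
and the discretized class is
$\mathcal{F}_m = \{\, f_{t_j} : f \in \mathcal{F},\ 1 \le j \le m-1 \,\}$.
Writing $d = \mathrm{VC}(\mathcal{F}_m)$ for its ordinary VC dimension,
Sauer's Lemma bounds the growth function:
\[
  \Pi_{\mathcal{F}_m}(n) \;\le\; \sum_{i=0}^{d} \binom{n}{i}
  \;\le\; \Bigl(\frac{en}{d}\Bigr)^{d} \qquad (n \ge d),
\]
which is the combinatorial step behind standard PAC-style error bounds of order
$\sqrt{\bigl(d\log n + \log(1/\delta)\bigr)/n}$.
\end{document}

Under this reading, how fast $\mathrm{VC}(\mathcal{F}_m)$ grows as the partition is refined (i.e. as $m$ increases) is the kind of information on which the abstract's four-way categorization of infinite-VC real function classes could plausibly be based.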


