Annual Conference on Neural Information Processing Systems

Predictive PAC Learning and Process Decompositions



Abstract

We informally call a stochastic process learnable if it admits a generalization error approaching zero in probability for any concept class with finite VC-dimension (IID processes are the simplest example). A mixture of learnable processes need not be learnable itself, and certainly its generalization error need not decay at the same rate. In this paper, we argue that it is natural in predictive PAC to condition not on the past observations but on the mixture component of the sample path. This definition not only matches what a realistic learner might demand, but also allows us to sidestep several otherwise grave problems in learning from dependent data. In particular, we give a novel PAC generalization bound for mixtures of learnable processes with a generalization error that is not worse than that of each mixture component. We also provide a characterization of mixtures of absolutely regular (β-mixing) processes, of independent probability-theoretic interest.
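The abstract's central point can be illustrated with a toy simulation. The sketch below (my own illustration, not the paper's construction) draws sample paths from a mixture of two IID Bernoulli processes: each path first picks a component mean `p` at random, then emits IID draws. Conditional on the component, the empirical mean concentrates around `p`, so the conditional generalization error vanishes; measured against the unconditional mixture mean of 0.5, the error along any single path stays bounded away from zero. All names here are hypothetical.

```python
import random

def sample_path(n, p, rng):
    """IID Bernoulli(p) sample path of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def path_errors(n, rng):
    # Draw the mixture component once per path: p = 0.2 or 0.8, each w.p. 1/2.
    p = 0.2 if rng.random() < 0.5 else 0.8
    emp = sum(sample_path(n, p, rng)) / n
    # Conditional error: distance to the active component's mean.
    # Unconditional error: distance to the mixture mean 0.5.
    return abs(emp - p), abs(emp - 0.5)

rng = random.Random(0)
cond, uncond = zip(*(path_errors(2000, rng) for _ in range(200)))
avg_cond = sum(cond) / len(cond)
avg_uncond = sum(uncond) / len(uncond)
# avg_cond shrinks with n (the component is learnable); avg_uncond does not,
# since each path "lives in" one component and never sees the mixture mean.
```

This is the motivation for conditioning on the mixture component of the sample path rather than on past observations: per component the process is learnable at the component's own rate, matching the paper's bound that the mixture's generalization error is no worse than that of each component.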


