Theoretical Computer Science

Aspects of complexity of probabilistic learning under monotonicity constraints

Abstract

In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces (cf. Meyer, Theoret. Comput. Sci. 185 (1997) 81-128). In this paper, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. In particular, we deal with the question of how much additional information provided by oracles is necessary to compensate for the additional power of probabilistic learning machines. In Section 1, we show that K is necessary and sufficient to compensate for the additional power of probabilistic learning machines in the case of conservative (monotonic) probabilistic learning with p > 1/2 (p > 2/3), and for strong-monotonic probabilistic learning with 1/2 < p ≤ 2/3. In the case of strong-monotonic learning with p > 2/3, however, every Peano-complete oracle is sufficient to compensate for the power of probabilistic learning machines. In contrast, the oracle H is not sufficient to compensate for the power of conservative and strong-monotonic probabilistic learning with probability p = 1/2, or of monotonic probabilistic learning with p = 2/3. The main result in Section 2 is that for each oracle A ≤_T K, there exists an indexed family L_A which is properly conservatively identifiable with p = 1/2, and which exactly reflects the Turing degree of A, i.e., L_A is properly conservatively identifiable by an oracle machine M[B] iff A ≤_T B. Thus, for every oracle A below K, we can construct a learning problem characterizing A within proper conservative learning. However, not every indexed family which is conservatively identifiable with probability p = 1/2 reflects the Turing degree of an oracle. Hence, the conservative probabilistic learning classes are more highly structured than the Turing degrees below K. Finally, we prove that there exist learning problems which are conservatively (monotonically) identifiable with probability p = 1/2 (p = 2/3), but conservatively (monotonically) identifiable only by oracle machines having access to TOT; for strong-monotonic learning, this result does not hold. (C) 2001 Elsevier Science B.V. All rights reserved. [References: 44]
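
The Section 2 degree-reflection result can be restated formally, as in the sketch below. The abstract itself does not fix the symbols, so the notation here is assumed: ≤_T denotes Turing reducibility, K the halting problem, and M[B] a learning machine with access to oracle B.

```latex
% A sketch formalizing the Section 2 degree-reflection result.
% Notation assumed (not fixed by the abstract): \leq_T is Turing
% reducibility, K is the halting problem, and M[B] is a learning
% machine with access to oracle B.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
For every oracle $A \leq_T K$ there is an indexed family
$\mathcal{L}_A$, properly conservatively identifiable with
probability $p = \tfrac{1}{2}$, such that for every oracle $B$:
\[
  \mathcal{L}_A \text{ is properly conservatively identifiable by some }
  M[B] \iff A \leq_T B .
\]
\end{document}
```

In this form it is clear why L_A "characterizes" A: membership of L_A in the oracle learning class for B decides, for every B, whether A is Turing-reducible to B.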