Home > Foreign conference proceedings > Computational learning theory > Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability

Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability



Abstract

The present paper deals with monotonic and dual-monotonic probabilistic identification of indexed families of uniformly recursive languages from positive data. In particular, we consider the special case where the probability is required to equal 1. Earlier results in the field of probabilistic identification established that, in the setting of function identification, each collection of recursive functions identifiable with probability p > 1/2 is deterministically identifiable (cf. [23]). In the case of language learning from text, each collection of recursive languages identifiable from text with probability p > 2/3 is deterministically identifiable (cf. [20]). In particular, there is no gain in learning power when the collections of functions or languages are required to be inferred with probability p = 1. As shown in [18], highly structured probabilistic hierarchies arise when dealing with probabilistic learning under monotonicity constraints. In this paper, we consider monotonic and dual-monotonic probabilistic learning of indexed families with respect to proper, class-preserving, and class-comprising hypothesis spaces. In particular, we prove, for proper monotonic as well as for proper dual-monotonic learning, that probabilistic learning is more powerful than deterministic learning even when the probability is required to equal 1. To establish this result, we need a sophisticated version of the proof technique developed in [17].
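The p > 1/2 threshold for function identification has a simple intuition behind it: a learner that converges to the correct hypothesis with probability strictly greater than 1/2 can be derandomized by a majority vote over its possible runs. The following is a purely illustrative sketch of that amplification idea; the learner, its success probability 0.6, and the hypothesis space of size 10 are all invented for the example (the actual proof in [23] argues over the learner's tree of coin-toss sequences, not sampled runs).

```python
import random
from collections import Counter

def probabilistic_learner(target: int, rng: random.Random) -> int:
    """Toy stand-in for a probabilistic inference machine: it outputs
    the correct hypothesis index with probability 0.6 > 1/2, and an
    arbitrary index from a 10-element hypothesis space otherwise."""
    return target if rng.random() < 0.6 else rng.randrange(10)

def derandomized_learner(target: int, trials: int = 1001) -> int:
    """Deterministic wrapper: simulate many independent runs and output
    the majority hypothesis. Since each run is correct with probability
    > 1/2, the majority is correct with probability approaching 1 as
    the number of trials grows; a fixed seed makes the whole procedure
    deterministic."""
    rng = random.Random(42)  # fixed seed: the wrapper itself flips no real coins
    votes = Counter(probabilistic_learner(target, rng) for _ in range(trials))
    return votes.most_common(1)[0][0]
```

With 1001 trials the failure probability of the majority vote is negligible, which mirrors why probability-(p > 1/2) function identification collapses to deterministic identification, and why the interesting separations in this paper must come from the monotonicity constraints rather than from the probability bound alone.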
