International Conference on Algorithmic Learning Theory

Concentration and Confidence for Discrete Bayesian Sequence Predictors

Abstract

Bayesian sequence prediction is a simple technique for predicting future symbols sampled from an unknown measure on infinite sequences over a countable alphabet. While strong bounds on the expected cumulative error are known, there are only limited results on the distribution of this error. We prove tight high-probability bounds on the cumulative error, which is measured in terms of the Kullback-Leibler (KL) divergence. We also consider the problem of constructing upper confidence bounds on the KL and Hellinger errors similar to those constructed from Hoeffding-like bounds in the i.i.d. case. The new results are applied to show that Bayesian sequence prediction can be used in the Knows What It Knows (KWIK) framework with bounds that match the state-of-the-art.
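To make the setting concrete, the following is a minimal sketch, not taken from the paper, of Bayesian sequence prediction over a binary alphabet with a finite class of i.i.d. Bernoulli models (all names, parameters, and the model class are illustrative assumptions). It accumulates the per-step KL error between the true conditional distribution and the mixture predictor, which is the cumulative error whose concentration the paper studies.

```python
import numpy as np

# Minimal sketch (illustrative, not from the paper): Bayesian sequence
# prediction over a binary alphabet with a finite class of i.i.d.
# Bernoulli models. The mixture predictor xi is the posterior-weighted
# average of the candidates; we accumulate the per-step error
# KL( mu(. | x_<t) || xi(. | x_<t) ) against the true measure mu.

rng = np.random.default_rng(0)

thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])          # candidate Bernoulli biases
posterior = np.full(len(thetas), 1.0 / len(thetas))   # uniform prior weights w(nu)
true_theta = 0.7                                       # the unknown measure mu

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * np.log(p / q) + (1.0 - p) * np.log((1.0 - p) / (1.0 - q))

cumulative_kl = 0.0
T = 1000
for t in range(T):
    xi_one = float(posterior @ thetas)        # mixture probability that x_t = 1
    cumulative_kl += kl_bernoulli(true_theta, xi_one)
    x = rng.random() < true_theta             # sample x_t from mu
    likelihood = thetas if x else 1.0 - thetas
    posterior = posterior * likelihood        # Bayes update of the weights
    posterior /= posterior.sum()

print(f"cumulative KL error after {T} steps: {cumulative_kl:.3f}")
print(f"ln(1/w(mu)) = {np.log(len(thetas)):.3f}")  # classical bound on the expectation
```

The same construction extends to a countable alphabet and model class provided the prior weights are summable. The classical result bounds the expected cumulative KL error by ln(1/w(mu)), the log inverse prior weight of the true model; the paper's contribution is high-probability analogues of such bounds and computable upper confidence bounds on the error.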
