International Conference on Algorithmic Learning Theory

Monotone Conditional Complexity Bounds on Future Prediction Errors



Abstract

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume we are at a time t > 1 and already observed x = x_1...x_t. We bound the future prediction performance on x_(t+1)x_(t+2)... by a new variant of algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
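The Solomonoff-style bound mentioned above has a simple finite analogue that can be checked numerically. For a Bayes mixture ξ = Σ_ν w_ν ν over a countable model class, ξ(x) ≥ w_μ μ(x) for every string x, so the cumulative log-loss regret ln(μ(x)/ξ(x)) is bounded pointwise by ln(1/w_μ), the "complexity" (negative log prior weight) of the true environment μ. The sketch below illustrates this with a finite Bernoulli class; the class, prior, and parameter values are illustrative assumptions, not the paper's construction.

```python
import math
import random

# Model class: Bernoulli(theta) for a grid of parameters, uniform prior.
thetas = [i / 10 for i in range(1, 10)]
w = 1 / len(thetas)  # prior weight of each model; ln(1/w) plays the role of K(mu)

def env_prob(x, theta):
    """Probability of bit string x under Bernoulli(theta)."""
    p = 1.0
    for b in x:
        p *= theta if b == 1 else 1 - theta
    return p

def mixture_prob(x):
    """Probability of x under the Bayes mixture xi = sum_nu w_nu * nu."""
    return sum(w * env_prob(x, t) for t in thetas)

# Sample a sequence from a true environment mu inside the class.
random.seed(0)
true_theta = 0.3
x = [1 if random.random() < true_theta else 0 for _ in range(200)]

# Pointwise regret bound: ln(mu(x)/xi(x)) <= ln(1/w_mu), for every x.
regret = math.log(env_prob(x, true_theta) / mixture_prob(x))
bound = math.log(1 / w)
assert regret <= bound + 1e-9
```

The bound here is deterministic (it holds for every sequence, not just in expectation), which is the same mechanism behind Solomonoff's total-deviation bound with the universal prior weight 2^{-K(μ)} in place of w_μ.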

