International Symposium on Neural Networks

The Bounds on the Rate of Uniform Convergence for Learning Machine

Abstract

Generalization performance is an important property of learning machines. A desirable learning machine should be stable with respect to the training samples. We consider empirical risk minimization over function sets from which noise has been eliminated. By applying Kutin's inequality, we establish bounds on the rate of uniform convergence of the empirical risks to their expected risks for learning machines and compare these bounds with known results.
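As background for the abstract, the quantities it refers to can be written in the standard statistical learning form; the notation here ($\mathcal{F}$ for the function set, $\ell$ for the loss, $n$ for the sample size) is generic and not taken from the paper itself:

\[
R(f) = \mathbb{E}_{(X,Y)\sim P}\big[\ell(f(X),Y)\big],
\qquad
R_{\mathrm{emp}}(f) = \frac{1}{n}\sum_{i=1}^{n} \ell\big(f(x_i),y_i\big),
\]
and a uniform convergence bound controls, for every $\varepsilon > 0$,
\[
\Pr\Big(\sup_{f\in\mathcal{F}} \big|R_{\mathrm{emp}}(f) - R(f)\big| > \varepsilon\Big) \le \delta(n,\varepsilon,\mathcal{F}),
\]
where $\delta(n,\varepsilon,\mathcal{F}) \to 0$ as $n$ grows. The paper's contribution is a bound of this type, derived via Kutin's inequality, for noise-eliminated function sets.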
