International Symposium on Neural Networks (ISNN 2005), Part 1; May 30 to June 1, 2005; Chongqing, China

The Bounds on the Rate of Uniform Convergence for Learning Machine



Abstract

Generalization performance is an important property of learning machines. A desirable learning machine should be stable with respect to its training samples. We consider empirical risk minimization over function sets from which noise has been eliminated. By applying Kutin's inequality, we establish bounds on the rate of uniform convergence of the empirical risks to their expected risks for learning machines, and compare these bounds with known results.
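To illustrate the uniform-convergence setting the abstract refers to (this is a generic sketch, not the paper's construction or its Kutin-inequality bound), the following Python example estimates the empirical risk of a single fixed hypothesis on i.i.d. samples and compares the observed gap to the expected risk against the classical Hoeffding concentration bound. The toy distribution, the hypothesis `h`, and the confidence level are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy setting: x ~ Uniform(0, 1), true label y = 1 iff x > 0.3.
# The fixed hypothesis h predicts 1 iff x > 0.5, so it errs exactly
# when 0.3 < x <= 0.5; its expected 0-1 risk is therefore 0.2.

def h(x):
    return 1 if x > 0.5 else 0

def empirical_risk(n):
    """Average 0-1 loss of h over n i.i.d. samples."""
    errors = 0
    for _ in range(n):
        x = random.random()
        y = 1 if x > 0.3 else 0
        errors += (h(x) != y)
    return errors / n

EXPECTED_RISK = 0.2

for n in (100, 10_000, 1_000_000):
    gap = abs(empirical_risk(n) - EXPECTED_RISK)
    # Hoeffding: P(|empirical - expected| >= eps) <= 2 exp(-2 n eps^2),
    # so with probability at least 0.95 the gap stays below this eps:
    eps_95 = math.sqrt(math.log(2 / 0.05) / (2 * n))
    print(f"n={n:>9}: gap = {gap:.4f}, 95% Hoeffding bound = {eps_95:.4f}")
```

For a single hypothesis the gap shrinks at rate O(1/sqrt(n)); uniform-convergence results of the kind the paper studies control this gap simultaneously over a whole function class, which is where refinements such as Kutin's extension of McDiarmid's inequality come in.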
