Applied Stochastic Models in Business and Industry

A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers


Abstract

Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied independently. Extending several standard results, among them a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers devoted to multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.
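For orientation, the decision architecture the abstract refers to, shared by the multi-class SVMs considered, computes one real-valued score function per class, predicts by taking the argmax over those scores, and measures the multi-class margin of an example as the gap between the score of its class and the largest competing score. The sketch below is a minimal illustration of that architecture only, not code from the paper; the linear score functions and every name in it (W, b, X, y, scores, predict, multiclass_margins) are assumptions made for the example.

import numpy as np

# Minimal illustrative sketch (not from the paper) of the shared multi-class
# large margin architecture: one score function h_k per class, prediction by
# argmax, margin = h_y(x) - max_{k != y} h_k(x). Linear scores are assumed.

def scores(W, b, X):
    # Class score functions h_k(x) = <w_k, x> + b_k, one row of W per class.
    return X @ W.T + b                      # shape: (n_samples, n_classes)

def predict(W, b, X):
    # Assign each sample to the class with the largest score.
    return np.argmax(scores(W, b, X), axis=1)

def multiclass_margins(W, b, X, y):
    # Margin of each sample: score of its class minus the best competing score.
    S = scores(W, b, X)
    idx = np.arange(len(y))
    true_scores = S[idx, y]
    S_other = S.copy()
    S_other[idx, y] = -np.inf
    return true_scores - S_other.max(axis=1)

# Toy usage: 3 classes, 2 features, random parameters.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = np.zeros(3)
X = rng.normal(size=(5, 2))
y = predict(W, b, X)
print(multiclass_margins(W, b, X, y))       # non-negative when y are the predictions

The different multi-class SVM formulations leave predict and multiclass_margins unchanged and differ only in the training objective used to learn the score functions, which is what makes a single theoretical framework over this architecture possible.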
