
Least square multi-class kernel machines with prior knowledge and applications.

Abstract

In this study, the problem of discriminating between objects of two or more classes with (or without) prior knowledge is investigated. We show how a two-class discrimination model, with or without prior knowledge, can be extended to the multi-categorical case. The prior knowledge of interest takes the form of multiple polyhedral sets belonging to one or more categories, classes, or labels, and is introduced as additional constraints in the classification model formulation. The solution of the knowledge-based support vector machine (KBSVM) model for two-class discrimination is characterized by a linear programming (LP) problem, owing to the specific norm (L1 or L-infinity) used to compute the distance between the two classes. We propose solutions to classification problems, with (or without) prior knowledge, expressed as a single unconstrained optimization problem via a regularized least squares cost function, in order to obtain a linear system of equations in input space and/or in the dual space induced by a kernel function, which can be solved using matrix methods or iterative methods. Advantages of this formulation include explicit expressions for the classification weights of the classifier(s); the ability to incorporate prior knowledge directly into the classifiers; and the ability to handle several classes in a single formulation, providing fast solutions for the optimal classification weights in multicategorical separation. Comparisons with other learning techniques, such as the least squares SVM and MSVM developed by Suykens and Vandewalle (1999b, 1999c) and the knowledge-based SVM developed by Fung et al. (2002), indicate that the regularized least squares methods are more efficient in terms of misclassification test error and computational time.
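The two-class variant described above reduces training to a single linear system in the dual space. A minimal sketch, following Suykens and Vandewalle's least squares SVM formulation (names such as `rbf_kernel`, `ls_svm_fit`, and the `gamma`/`sigma` parameters are illustrative, not taken from the dissertation):

```python
# Sketch of a binary least squares SVM (LS-SVM) solved in dual space:
# the optimal weights come from one linear system rather than a QP.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ls_svm_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM dual system
    #   [ 0        y^T         ] [b]     [0]
    #   [ y   Omega + I/gamma  ] [alpha] [1]
    # with Omega[i, j] = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, bias b

def ls_svm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

Because the model is fit by matrix factorization (or an iterative solver for large `n`), training cost is dominated by solving one (n+1)-by-(n+1) system, which is the computational advantage the abstract refers to.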

Bibliographic record

  • Author

    Oladunni, Olutayo O.

  • Author affiliation

    The University of Oklahoma.

  • Degree grantor The University of Oklahoma.
  • Subject Industrial Engineering.
  • Degree Ph.D.
  • Year 2006
  • Pagination 179 p.
  • Total pages 179
  • Format PDF
  • Language eng
  • Classification General industrial technology
  • Keywords
