
Techniques in support vector classification.


Abstract

This work falls into the field of Pattern Classification and, more generally, Artificial Intelligence. Classification is the problem of assigning a "pattern" z to membership in a finite set ("class") X or in a disjoint finite set Y. In case z ∈ Rⁿ and X, Y ⊂ Rⁿ, we can solve this problem using Support Vector Machines. Support Vector Machines are functions of the form

    f(z) = sign( Σᵢ αᵢ k(xᵢ, z) + Σⱼ βⱼ k(yⱼ, z) + b ),   (∗)

where k: Rⁿ × Rⁿ → R, and z is classified as a member of X = {xᵢ} if f(z) > 0 and as a member of Y = {yⱼ} otherwise. We consider three problems in classification, two of which concern Support Vector Machines.

Our first problem concerns feature selection for classification. Feature selection is the problem of identifying properties which distinguish between the two classes X and Y. Color, for example, distinguishes between apples and oranges, while shape may not. Our method of feature selection uses a novel combination of a linear classifier known as Fisher's discriminant and a nonlinear (polynomial) map known as the Veronese map. We apply our method to a problem in materials design.

Our second problem concerns the selection of the kernel k: Rⁿ × Rⁿ → R in (∗). For kernel selection we use a kernel version of the classical Gram-Schmidt orthonormalization procedure, again coupled with Fisher's discriminant. We apply our method to the materials design problem and to a handwritten digit recognition problem.

Finally, we consider the problem of training Support Vector Machines. Specifically, we develop a fast method for obtaining the coefficients αᵢ and βⱼ in (∗). Traditionally, these coefficients are found by solving a constrained quadratic programming problem. We present a geometric reformulation of the SVM quadratic programming problem and then, using this reformulation, a modified version of Gilbert's algorithm for obtaining the coefficients αᵢ and βⱼ. We compare our algorithm with the Nearest Point Algorithm and with Sequential Minimal Optimization.
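
As a concrete reading of (∗), the following is a minimal numerical sketch of the decision rule in Python; the RBF kernel, the variable names, and the ±1 encoding are illustrative assumptions, not details fixed by the dissertation.

    import numpy as np

    def rbf_kernel(u, v, gamma=1.0):
        # An example kernel k: R^n x R^n -> R; any symmetric positive-definite
        # kernel can play the role of k in (*). The Gaussian form is assumed here.
        return np.exp(-gamma * np.sum((u - v) ** 2))

    def svm_decide(z, X, Y, alpha, beta, b, k=rbf_kernel):
        # f(z) = sign( sum_i alpha_i k(x_i, z) + sum_j beta_j k(y_j, z) + b )
        s = sum(a * k(x, z) for a, x in zip(alpha, X))
        s += sum(c * k(y, z) for c, y in zip(beta, Y))
        return 1 if s + b > 0 else -1   # +1: z assigned to X; -1: z assigned to Y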
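The feature-selection idea combines a polynomial lift with a linear discriminant. Below is a hedged sketch of one plausible instantiation, assuming a degree-2 Veronese map and the standard Fisher direction; the dissertation's exact procedure may differ.

    import numpy as np
    from itertools import combinations_with_replacement

    def veronese_map(X, degree=2):
        # Degree-2 Veronese (polynomial) map: append every monomial x_a * x_b
        # to the raw features, lifting the data into a higher-dimensional space.
        feats = [X]
        for a, b in combinations_with_replacement(range(X.shape[1]), 2):
            feats.append((X[:, a] * X[:, b])[:, None])
        return np.hstack(feats)

    def fisher_direction(A, B, reg=1e-6):
        # Fisher's linear discriminant: w = S_w^{-1} (mu_A - mu_B).
        mu_a, mu_b = A.mean(0), B.mean(0)
        Sw = np.cov(A, rowvar=False) + np.cov(B, rowvar=False)
        Sw += reg * np.eye(Sw.shape[0])   # regularize the within-class scatter
        return np.linalg.solve(Sw, mu_a - mu_b)

    # Ranking lifted features by |w| marks the monomials that separate the
    # classes, suggesting which raw features (e.g. color vs. shape) to keep.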
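For kernel selection, Gram-Schmidt orthonormalization can be carried out in feature space using kernel evaluations only: factoring the Gram matrix yields explicit coordinates on which Fisher's discriminant can score a candidate kernel. A simplified sketch under that assumption follows; the scoring criterion below is an assumption for illustration, not necessarily the author's.

    import numpy as np

    def kernel_gram_schmidt_coords(K, tol=1e-10):
        # Gram-Schmidt in feature space via kernel values alone: a Cholesky
        # factorization K = R.T @ R gives, in column i of R, the coordinates
        # of phi(x_i) in the orthonormal basis Gram-Schmidt would produce.
        R = np.linalg.cholesky(K + tol * np.eye(len(K))).T
        return R

    def fisher_score(K, labels):
        # Score a candidate kernel: Fisher ratio of the two classes
        # (labels in {+1, -1}) in the orthonormal coordinates.
        Z = kernel_gram_schmidt_coords(K).T   # one row per data point
        A, B = Z[labels > 0], Z[labels <= 0]
        between = np.sum((A.mean(0) - B.mean(0)) ** 2)
        within = A.var(0).sum() + B.var(0).sum()
        return between / max(within, 1e-12)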
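Geometrically, for separable data the hard-margin SVM normal is the minimum-norm point of the convex hull of the difference set {xᵢ − yⱼ}, which Gilbert's algorithm approximates by repeated line searches. Below is a compact sketch of the unmodified algorithm; the dissertation presents a modified version, and forming the full difference set is done here only for clarity.

    import numpy as np

    def gilbert_min_norm(P, iters=1000, tol=1e-8):
        # Gilbert's algorithm: approximate the minimum-norm point of conv(P).
        w = P[0].astype(float)
        for _ in range(iters):
            s = P[np.argmin(P @ w)]     # support point: minimizes <p, w> over P
            if w @ w - w @ s < tol:     # small duality gap: w is near-optimal
                break
            d = w - s
            t = np.clip((w @ d) / (d @ d), 0.0, 1.0)
            w = w - t * d               # nearest point to 0 on segment [w, s]
        return w

    # For a separable SVM on classes X and Y, one (inefficient but clear) use:
    #   P = np.array([x - y for x in X for y in Y])
    #   w = gilbert_min_norm(P)   # w is normal to a maximum-margin hyperplane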

Bibliographic details

  • Author: Martin, Shawn Bryan
  • Affiliation: Colorado State University
  • Degree grantor: Colorado State University
  • Subject: Mathematics
  • Degree: Ph.D.
  • Year: 2001
  • Pages: 88
  • Format: PDF
  • Language: English
  • Classification (CLC): Mathematics
