Asia-Pacific Web Conference (APWeb 2004), 14–17 April 2004, Hangzhou, China

An Incremental Updating Method for Support Vector Machines


Abstract

Support Vector Machines (SVMs) have become a popular tool for learning from large amounts of high-dimensional data. However, it may sometimes be preferable to learn incrementally from previous SVM results: training an SVM requires solving a quadratic programming problem, which demands large amounts of memory and CPU time when performed in batch mode on large data sets, and SVMs may also be used in online learning settings. In this paper, an approach to incremental learning with Support Vector Machines is presented. We define the normal solution of incremental SVM learning as the solution that minimizes a given positive-definite quadratic form in the coordinates of the difference vector between the normal vectors at the (k-1)-th and k-th incremental steps, and we discuss its relation to the standard SVM. We show that the concept learned at the previous step does not change if the new data satisfy a separability condition, and we give empirical evidence that this approach effectively handles changes in the target concept arising in the incremental learning setting, according to three evaluation criteria: stability, improvement, and recoverability.
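Stated in symbols, the normal solution described in the abstract can be sketched as follows. This is a hedged reconstruction from the abstract's wording only, not the paper's exact formulation: the matrix Q, the normal vectors w_k, and the constraint set are assumed notation.

```latex
w_k \;=\; \arg\min_{w}\; (w - w_{k-1})^{\top} Q \,(w - w_{k-1})
\qquad \text{s.t.}\qquad y_i\bigl(\langle w, x_i\rangle + b\bigr) \ge 1,\;\; i \in D_k,
```

where Q is a given positive-definite matrix, w_{k-1} is the normal vector learned at the previous incremental step, and D_k indexes the data available at step k. When the new data are separable under w_{k-1}, the minimizer is w_k = w_{k-1}, which is consistent with the abstract's claim that the previously learned concept does not change in that case.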
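A common way to realize the memory and CPU savings the abstract motivates is to retrain only on the previous support vectors plus each new batch, rather than on all accumulated data. The sketch below illustrates that general scheme; it is an illustrative approximation using scikit-learn's `SVC`, not the paper's normal-solution method, and the toy data generator is an assumption.

```python
# Illustrative support-vector-preserving incremental SVM training.
# NOT the paper's exact formulation: at each step we retrain on the
# previous support vectors plus the new batch, so memory stays bounded
# by the size of the support set instead of the whole history.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_batch(n=100):
    # Two well-separated Gaussian blobs (hypothetical toy data).
    X0 = rng.normal(loc=-2.0, size=(n // 2, 2))
    X1 = rng.normal(loc=+2.0, size=(n // 2, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return X, y

def incremental_svm(batches, C=1.0):
    clf = SVC(kernel="linear", C=C)
    X_keep = np.empty((0, 2))
    y_keep = np.empty((0,), dtype=int)
    for X_new, y_new in batches:
        X_train = np.vstack([X_keep, X_new])
        y_train = np.concatenate([y_keep, y_new])
        clf.fit(X_train, y_train)
        # Keep only the support vectors as a compressed summary of the past.
        X_keep = clf.support_vectors_
        y_keep = y_train[clf.support_]
    return clf

clf = incremental_svm([make_batch() for _ in range(3)])
X_test, y_test = make_batch(200)
print("test accuracy:", round(clf.score(X_test, y_test), 2))
```

Keeping only the support vectors works well when the target concept is stable between steps; when it drifts, discarded non-support points can no longer influence later solutions, which is exactly the trade-off the paper's stability/improvement/recoverability criteria are meant to probe.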

