Square Penalty Support Vector Regression

Abstract

Support Vector Regression (SVR) is usually pursued using the ε-insensitive loss function, although, alternatively, the initial regression problem can be reduced to a properly defined classification one. In either case, slack variables have to be introduced in practically interesting problems, the usual choice being linear penalties for them. In this work we shall discuss the solution of an SVR problem by first recasting it as a classification problem and working with square penalties. Besides a general theoretical discussion, we shall also derive some consequences of the coefficient structure of the resulting SVMs for regression problems, and illustrate the procedure on some standard benchmark problems as well as on a wind energy forecasting problem.
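For background only, and not taken from the paper's own derivation, the standard ε-insensitive SVR primal with linear slack penalties can be sketched as follows (the notation w, b for the model parameters, ξ_i, ξ_i^* for the slack variables, and C for the regularization constant is assumed here):

\[
\begin{aligned}
\min_{w,\,b,\,\xi,\,\xi^*} \quad & \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\left(\xi_i + \xi_i^*\right) \\
\text{s.t.} \quad & y_i - w^\top x_i - b \le \varepsilon + \xi_i, \\
& w^\top x_i + b - y_i \le \varepsilon + \xi_i^*, \\
& \xi_i,\ \xi_i^* \ge 0, \qquad i = 1,\dots,n.
\end{aligned}
\]

The square-penalty variant replaces the linear term \(\sum_i (\xi_i + \xi_i^*)\) by the quadratic term \(\sum_i \bigl(\xi_i^2 + (\xi_i^*)^2\bigr)\), after which the nonnegativity constraints on the slacks become redundant; the paper itself reaches this setting by first recasting the regression problem as a classification one.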