Conference proceedings: Artificial Neural Networks in Pattern Recognition

Incremental Feature Selection by Block Addition and Block Deletion Using Least Squares SVRs


Abstract

For a small-sample problem with a large number of features, feature selection by cross-validation frequently resorts to random tie breaking because the recognition rate takes only discrete values, and this leads to inferior feature selection results. To solve this problem, we propose using a least squares support vector regressor (LS SVR) instead of an LS support vector machine (LS SVM). We take the class labels (1/-1) as the targets of the LS SVR and use the mean absolute error estimated by cross-validation as the selection criterion. With the LS SVR, the selection and ranking criteria become continuous, so ties rarely occur. For evaluation, we use the incremental block addition and block deletion of features that was developed for function approximation. Computer experiments show that the performance of the proposed method is comparable to that obtained with a criterion based on the weighted sum of the recognition error rate and the average margin error.
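As a rough illustration of the idea, the sketch below (not the authors' implementation) uses scikit-learn's KernelRidge with an RBF kernel as a stand-in for the LS SVR, takes the class labels (+1/-1) as regression targets, and uses the cross-validated mean absolute error as a continuous selection criterion inside a simple block addition / block deletion loop. The block size, kernel parameters, and stopping rules here are hypothetical and only show why a continuous criterion avoids the ties that a discrete recognition rate produces.

```python
# Minimal sketch, assuming KernelRidge as a substitute for an LS SVR and
# hypothetical hyperparameters; the paper's exact procedure may differ.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_predict


def mae_criterion(X, y, features, gamma=0.1, alpha=1.0, cv=5):
    """Cross-validated mean absolute error of a kernel least-squares
    regressor trained on the labels (+1/-1), restricted to `features`.
    Being continuous, it rarely produces ties, unlike the error rate."""
    model = KernelRidge(kernel="rbf", gamma=gamma, alpha=alpha)
    pred = cross_val_predict(model, X[:, features], y, cv=cv)
    return float(np.mean(np.abs(y - pred)))


def block_add_delete(X, y, block_size=2, tol=1e-6):
    """Greedy block addition followed by block deletion (illustrative only)."""
    remaining = list(range(X.shape[1]))
    selected = []
    best = np.inf

    # Block addition: rank candidate features by the criterion when appended
    # to the current subset and add the best block while it improves.
    while remaining:
        scores = sorted((mae_criterion(X, y, selected + [f]), f) for f in remaining)
        block = [f for _, f in scores[:block_size]]
        candidate = selected + block
        score = mae_criterion(X, y, candidate)
        if score < best - tol:
            selected, best = candidate, score
            remaining = [f for f in remaining if f not in block]
        else:
            break

    # Block deletion: try removing the block of least useful features and
    # keep the deletion if the criterion does not deteriorate.
    improved = True
    while improved and len(selected) > block_size:
        improved = False
        scores = sorted((mae_criterion(X, y, [g for g in selected if g != f]), f)
                        for f in selected)
        block = [f for _, f in scores[:block_size]]
        candidate = [f for f in selected if f not in block]
        score = mae_criterion(X, y, candidate)
        if score <= best + tol:
            selected, best = candidate, score
            improved = True

    return selected, best


# Example usage with random data (labels must be +1/-1):
# X = np.random.randn(40, 30); y = np.sign(np.random.randn(40))
# subset, score = block_add_delete(X, y)
```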