Computational Mathematics and Mathematical Physics
An Efficient Method for Feature Selection in Linear Regression Based on an Extended Akaike's Information Criterion



Abstract

A method for feature selection in linear regression based on an extension of Akaike's information criterion is proposed. The use of the classical Akaike information criterion (AIC) for feature selection assumes an exhaustive search through all subsets of features, which has unreasonably high computational and time cost. A new information criterion is proposed that is a continuous extension of AIC. As a result, the feature selection problem is reduced to a smooth optimization problem. An efficient procedure for solving this problem is derived. Experiments show that the proposed method enables one to efficiently select features in linear regression. In the experiments, the proposed procedure is compared with the relevance vector machine, which is a feature selection method based on the Bayesian approach. It is shown that both procedures yield similar results. The main distinction of the proposed method is that certain regularization coefficients are identically zero. This makes it possible to avoid the underfitting effect, which is a characteristic feature of the relevance vector machine. A special case (the so-called nondiagonal regularization) is considered in which both methods are identical.
