
Minimax Sparse Logistic Regression for Very High-Dimensional Feature Selection


Abstract

Because of its strong convexity and probabilistic underpinnings, logistic regression (LR) is widely used in many real-world applications. However, in many problems, such as bioinformatics, choosing a small subset of features with the most discriminative power is desirable for interpreting the prediction model, making robust predictions, or performing deeper analysis. To achieve a sparse solution with respect to input features, many sparse LR models have been proposed. However, it is still challenging for them to efficiently obtain unbiased sparse solutions to very high-dimensional problems (e.g., identifying the most discriminative subset from millions of features). In this paper, we propose a new minimax sparse LR model for very high-dimensional feature selection, which can be efficiently solved by a cutting-plane algorithm. To solve the resultant nonsmooth minimax subproblems, a smoothing coordinate descent method is presented. The numerical issues and convergence rate of this method are carefully studied. Experimental results on several synthetic and real-world datasets show that the proposed method can obtain better prediction accuracy with the same number of selected features and has better or competitive scalability on very high-dimensional problems compared with baseline methods, including $\ell_{1}$-regularized LR.
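The abstract names $\ell_{1}$-regularized LR as the main baseline. The sketch below, assuming a scikit-learn setup with arbitrary synthetic data sizes and an arbitrary regularization strength C, illustrates how an $\ell_{1}$-penalized LR produces a sparse feature subset; it does not reproduce the paper's minimax sparse LR model or its cutting-plane / smoothing coordinate descent solver.

```python
# Minimal sketch of the l1-regularized LR baseline for sparse feature selection.
# Dataset sizes and C are illustrative choices, not values from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic high-dimensional data: many features, only a few informative.
X, y = make_classification(n_samples=200, n_features=5000,
                           n_informative=20, random_state=0)

# The l1 penalty drives most coefficients to exactly zero,
# so the nonzero weights define the selected feature subset.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_[0])  # indices of features with nonzero weight
print(f"selected {selected.size} of {X.shape[1]} features")
```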

