
A Method for Large-Scale ℓ_1-Regularized Logistic Regression



Abstract

Logistic regression with ℓ_1 regularization has been proposed as a promising method for feature selection in classification problems. Several specialized solution methods have been proposed for ℓ_1-regularized logistic regression problems (LRPs). However, existing methods do not scale well to the large problems that arise in many practical settings. In this paper we describe an efficient interior-point method for solving ℓ_1-regularized LRPs. Small problems with up to a thousand or so features and examples can be solved in seconds on a PC. A variation on the basic method, which uses a preconditioned conjugate gradient method to compute the search step, can solve large sparse problems with a million features and examples (e.g., the 20 Newsgroups data set) in a few tens of minutes on a PC. Numerical experiments show that our method outperforms standard methods for solving convex optimization problems as well as other methods specifically designed for ℓ_1-regularized LRPs.
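To make the problem in the abstract concrete: an ℓ_1-regularized LRP minimizes the average logistic loss plus λ‖w‖₁, and the ℓ_1 term drives many coefficients exactly to zero, which is what enables feature selection. The sketch below solves a small instance with a plain proximal-gradient (ISTA) loop — a simple first-order stand-in, not the paper's interior-point method — and all data, function names, and parameter values here are invented for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_logreg(X, y, lam, step=0.1, iters=500):
    """Illustrative proximal-gradient solver for the l1-regularized LRP:

        minimize (1/m) * sum_i log(1 + exp(-y_i * x_i.w)) + lam * ||w||_1,

    with labels y in {-1, +1}. A first-order sketch, not the interior-point
    method of the paper.
    """
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(iters):
        z = y * (X @ w)
        # Gradient of the average logistic loss at w.
        g = -(X.T @ (y / (1.0 + np.exp(z)))) / m
        # Gradient step on the smooth part, then soft-threshold for the l1 part.
        w = soft_threshold(w - step * g, step * lam)
    return w

# Synthetic instance with a sparse ground-truth weight vector.
rng = np.random.default_rng(0)
m, n = 200, 50
w_true = np.zeros(n)
w_true[:5] = [2.0, -2.0, 1.5, -1.5, 1.0]
X = rng.standard_normal((m, n))
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(m))

w = l1_logreg(X, y, lam=0.1)
# The l1 penalty zeroes out most of the 50 coordinates exactly.
print(np.count_nonzero(np.abs(w) > 1e-8))
```

The soft-thresholding step is what produces exact zeros (an ordinary gradient step never would), which is why ℓ_1-regularized solutions select features rather than merely shrinking them.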

