AAAI Conference on Artificial Intelligence

A Method for Large-Scale ℓ_1-Regularized Logistic Regression

Abstract

Logistic regression with ℓ_1 regularization has been proposed as a promising method for feature selection in classification problems. Several specialized solution methods have been proposed for ℓ_1-regularized logistic regression problems (LRPs). However, existing methods do not scale well to large problems that arise in many practical settings. In this paper we describe an efficient interior-point method for solving ℓ_1-regularized LRPs. Small problems with up to a thousand or so features and examples can be solved in seconds on a PC. A variation on the basic method, which uses a preconditioned conjugate gradient method to compute the search step, can solve large sparse problems, with a million features and examples (e.g., the 20 Newsgroups data set), in a few tens of minutes on a PC. Numerical experiments show that our method outperforms standard methods for solving convex optimization problems as well as other methods specifically designed for ℓ_1-regularized LRPs.
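The underlying problem is standard ℓ_1-regularized logistic regression: minimize the average logistic loss plus λ times the ℓ_1 norm of the weight vector, where λ > 0 controls how many features receive nonzero weight. As a minimal sketch of this problem class (not the paper's interior-point method), the following fits such a model with scikit-learn's generic "saga" solver on sparse 20 Newsgroups features, mirroring the data set mentioned in the abstract; the class subset and regularization strength are illustrative choices only.

    # Minimal sketch, NOT the paper's interior-point solver: fit an
    # l1-regularized logistic regression with scikit-learn's generic
    # "saga" solver on sparse 20 Newsgroups features.
    import numpy as np
    from sklearn.datasets import fetch_20newsgroups_vectorized
    from sklearn.linear_model import LogisticRegression

    # Sparse text features; keep two classes so the model is a plain
    # (binary) logistic regression.
    bunch = fetch_20newsgroups_vectorized(subset="train")
    idx = np.flatnonzero(bunch.target < 2)
    X, y = bunch.data[idx], bunch.target[idx]

    # C is the inverse of the regularization weight lambda: smaller C means
    # a stronger l1 penalty and a sparser weight vector (feature selection).
    clf = LogisticRegression(penalty="l1", solver="saga", C=1.0, max_iter=1000)
    clf.fit(X, y)

    selected = int((clf.coef_ != 0).sum())
    print(f"{selected} of {X.shape[1]} features have nonzero weight")

Sweeping C (the inverse of λ) traces out the regularization path; the number of nonzero coefficients shrinks as C decreases.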
