Home > Foreign-language journals > JMLR: Workshop and Conference Proceedings > A Unified Dynamic Approach to Sparse Model Selection

A Unified Dynamic Approach to Sparse Model Selection


Abstract

Sparse model selection is ubiquitous, from linear regression to graphical models, where regularization paths, i.e. families of estimators indexed by a varying regularization parameter, are computed when the regularization parameter is unknown or chosen data-adaptively. Traditional computational methods rely on solving a set of optimization problems with the regularization parameters fixed on a grid, which can be inefficient. In this paper, we introduce a simple iterative regularization path, which follows the dynamics of a sparse Mirror Descent algorithm, or equivalently a generalization of Linearized Bregman Iterations with nonlinear loss. Its performance is competitive with glmnet, with a further bias reduction. A path consistency theory is presented: under the Restricted Strong Convexity and the Irrepresentable Condition, the path first evolves in a subspace with no false positives and reaches an estimator that is sign-consistent or attains the minimax-optimal $\ell_2$ error rate. Early stopping regularization is required to prevent overfitting. Application examples are given in sparse logistic regression and Ising models for NIPS coauthorship.
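The iterative path the abstract describes can be sketched as a Linearized Bregman Iteration with a logistic loss: a gradient step on a dual variable followed by a soft-thresholding map back to a sparse primal iterate, so that the iteration index itself plays the role of the regularization parameter. The sketch below is illustrative only; the function name, step-size heuristic, and synthetic demo are assumptions, not the paper's exact algorithm or experiments.

```python
import numpy as np

def soft_threshold(z, t):
    """Componentwise soft-thresholding: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lbi_logistic_path(X, y, kappa=10.0, alpha=None, n_iter=500):
    """Linearized Bregman Iteration path for sparse logistic regression.

    The iteration index plays the role of the regularization parameter,
    so early stopping along the returned path acts as regularization.
    Labels y are in {-1, +1}.
    """
    n, p = X.shape
    if alpha is None:
        # Heuristic step size (an assumption): alpha * kappa kept below
        # a curvature bound of the loss so the dynamics stay stable.
        alpha = n / (kappa * np.linalg.norm(X, 2) ** 2)
    z = np.zeros(p)        # dual (mirror) variable
    beta = np.zeros(p)     # primal sparse estimate
    path = np.zeros((n_iter, p))
    for k in range(n_iter):
        # Gradient of the average logistic loss at the current beta.
        margins = y * (X @ beta)
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        z -= alpha * grad                      # gradient step in dual space
        beta = kappa * soft_threshold(z, 1.0)  # map back to a sparse primal
        path[k] = beta
    return path

# Small synthetic demo: 3 strong true coefficients out of 20.
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = 3.0
prob = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = np.where(rng.random(n) < prob, 1.0, -1.0)
path = lbi_logistic_path(X, y)
```

Along such a path, coordinates enter the model as their dual variables cross the soft-threshold, and stopping at an early iterate corresponds to heavier regularization, matching the early-stopping role described in the abstract.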

