JMLR: Workshop and Conference Proceedings

Error bounds for sparse classifiers in high-dimensions



Abstract

We prove an L2 recovery bound for a family of sparse estimators defined as minimizers of empirical loss functions, including the hinge loss and the logistic loss. More precisely, we obtain an upper bound on the coefficient-estimation error scaling as $(k^*/n)\log(p/k^*)$, where the design matrix is $n \times p$ and $k^*$ is the dimension of the theoretical loss minimizer. This is done under standard assumptions, for which we derive stronger versions of a cone condition and of restricted strong convexity. Our bound holds with high probability and in expectation, and applies both to an L1-regularized estimator and to the recently introduced Slope estimator, which we generalize to classification problems. Slope has the advantage of adapting to unknown sparsity. We therefore propose a tractable proximal algorithm to compute it and assess its empirical performance. Our results match the best existing bounds for classification and regression problems.
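The proximal step underlying such a Slope solver can be sketched with the standard stack-based algorithm for the prox of the sorted-ℓ1 norm (soft-threshold the decreasingly sorted magnitudes by the decreasing weight sequence, then pool adjacent violators to restore monotonicity). This is a minimal illustrative sketch, not the paper's exact implementation; the function name `prox_sorted_l1` and its arguments are assumptions:

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of the Slope (sorted-L1) penalty J(x) = sum_i lam[i] * |x|_(i),
    with lam non-increasing: argmin_x 0.5*||x - v||^2 + J(x)."""
    sign = np.sign(v)
    abs_v = np.abs(v)
    order = np.argsort(-abs_v)          # indices sorting |v| in decreasing order
    y = abs_v[order] - lam              # shifted values, then project to the
                                        # non-increasing non-negative cone (PAVA)
    blocks = []                         # stack of (start, total, count)
    for i, yi in enumerate(y):
        start, total, count = i, float(yi), 1
        # merge while the previous block's average violates monotonicity
        while blocks and blocks[-1][1] / blocks[-1][2] <= total / count:
            s, t, c = blocks.pop()
            start, total, count = s, t + total, c + count
        blocks.append((start, total, count))
    x_sorted = np.empty_like(y, dtype=float)
    for start, total, count in blocks:
        x_sorted[start:start + count] = max(total / count, 0.0)
    x = np.empty_like(v, dtype=float)
    x[order] = x_sorted                 # undo the sort, restore signs
    return sign * x
```

With this prox in hand, a proximal-gradient loop on a smooth classification loss (e.g. logistic) alternates a gradient step with `prox_sorted_l1` applied with step-size-scaled weights.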

