The Indian Journal of Statistics

The Horseshoe-Like Regularization for Feature Subset Selection


Abstract

Feature subset selection arises in many high-dimensional applications of statistics, such as compressed sensing and genomics. The ℓ0 penalty is ideal for this task, the caveat being that it requires the NP-hard combinatorial evaluation of all models. A recent area of considerable interest is to develop efficient algorithms to fit models with a non-convex ℓγ penalty for γ ∈ (0, 1), which results in sparser models than the convex ℓ1 or lasso penalty, but is harder to fit. We propose an alternative, termed the horseshoe regularization penalty for feature subset selection, and demonstrate its theoretical and computational advantages. The distinguishing feature from existing non-convex optimization approaches is a full probabilistic representation of the penalty as the negative of the logarithm of a suitable prior, which in turn enables efficient expectation-maximization and local linear approximation algorithms for optimization and MCMC for uncertainty quantification. In synthetic and real data, the resulting algorithms provide better statistical performance, and the computation requires a fraction of the time of state-of-the-art non-convex solvers.
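The abstract's key technical point is that the penalty is the negative logarithm of a prior density, which makes majorization schemes such as local linear approximation (LLA) applicable: at each step the non-convex penalty is replaced by a weighted ℓ1 penalty whose weights are the penalty's derivative at the current iterate, so each step is a weighted lasso. The sketch below illustrates only that generic LLA recipe; the stand-in penalty -log(log(1 + τ²/θ²)), the lasso initializer, and all tuning values are assumptions for illustration and are not the paper's horseshoe-like penalty, EM algorithm, or MCMC sampler.

```python
import numpy as np

def penalty_weight(theta, tau=1.0, eps=1e-8):
    """Derivative (w.r.t. |theta|) of the stand-in penalty
    pen(t) = -log(log(1 + tau**2 / t**2)); used as the LLA weight.
    The exact horseshoe-like form is given in the paper, not here."""
    t = np.abs(theta) + eps
    inner = np.log1p(tau**2 / t**2)
    return (2.0 * tau**2) / ((t**3 + t * tau**2) * inner)

def weighted_lasso_cd(X, y, weights, lam, n_sweeps=200):
    """Coordinate descent for 0.5*||y - X b||^2 + lam * sum_j weights[j]*|b[j]|."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    resid = y.copy()                       # maintained as y - X b
    for _ in range(n_sweeps):
        for j in range(p):
            resid += X[:, j] * b[j]        # partial residual without feature j
            rho = X[:, j] @ resid
            thr = lam * weights[j]
            b[j] = np.sign(rho) * max(abs(rho) - thr, 0.0) / col_sq[j]
            resid -= X[:, j] * b[j]
    return b

def lla_sparse_regression(X, y, lam=1.0, tau=1.0, n_steps=5):
    """A few LLA steps: plain lasso initializer, then reweight and re-solve."""
    theta = weighted_lasso_cd(X, y, np.ones(X.shape[1]), lam)
    for _ in range(n_steps):
        theta = weighted_lasso_cd(X, y, penalty_weight(theta, tau), lam)
    return theta

# Toy usage: recover a sparse signal from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
beta_true = np.zeros(50)
beta_true[:5] = 3.0
y = X @ beta_true + 0.5 * rng.standard_normal(100)
print("selected features:", np.nonzero(lla_sparse_regression(X, y))[0])
```

Each LLA step is solved by plain coordinate descent so the sketch stays self-contained; once a coordinate is set to zero, the diverging weight keeps it at zero, which is why the initializer matters.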
