Venue: JMLR: Workshop and Conference Proceedings

Sparse + Group-Sparse Dirty Models: Statistical Guarantees without Unreasonable Conditions and a Case for Non-Convexity



Abstract

Imposing sparse + group-sparse superposition structures in high-dimensional parameter estimation is known to provide flexible regularization that is more realistic for many real-world problems. For example, such a superposition enables partially-shared support sets in multi-task learning, thereby striking the right balance between parameter overlap across tasks and task specificity. Existing theoretical results on estimation consistency, however, are problematic as they require too stringent an assumption: the incoherence between sparse and group-sparse superposed components. In this paper, we fill the gap between the practical success and suboptimal analysis of sparse + group-sparse models, by providing the first consistency results that do not require unrealistic assumptions. We also study non-convex counterparts of sparse + group-sparse models. Interestingly, we show that these are guaranteed to recover the true support set under much milder conditions and with smaller sample size than convex models, which might be critical in practical applications as illustrated by our experiments.
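To make the superposition concrete, the sketch below fits a convex dirty model for multi-task regression: the coefficient matrix is estimated as the sum of a sparse component S (elementwise L1 penalty) and a group-sparse component G (row-wise L1/L2 penalty), minimized by alternating proximal-gradient steps. This is a minimal illustration assuming a standard least-squares loss; the function names, step-size choice, and regularization values are illustrative, not the authors' implementation.

```python
import numpy as np

def soft_threshold(A, t):
    # Elementwise soft-thresholding: prox of t * ||A||_1.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def group_soft_threshold(A, t):
    # Row-wise (group) soft-thresholding: prox of t * sum_j ||A_{j,:}||_2.
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return A * scale

def dirty_model(X, Y, lam_s=0.05, lam_g=0.05, n_iter=500):
    """Alternating proximal gradient on the convex dirty-model objective
        (1/2n) ||Y - X(S + G)||_F^2 + lam_s ||S||_1 + lam_g * sum_j ||G_{j,:}||_2,
    where rows of G are the groups (features shared across tasks)."""
    n, p = X.shape
    k = Y.shape[1]
    S = np.zeros((p, k))
    G = np.zeros((p, k))
    # Lipschitz constant of the smooth part's gradient: ||X||_2^2 / n.
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ (S + G) - Y) / n
        S = soft_threshold(S - step * grad, step * lam_s)
        grad = X.T @ (X @ (S + G) - Y) / n
        G = group_soft_threshold(G - step * grad, step * lam_g)
    return S, G
```

In a multi-task setting, nonzero rows of G capture the support shared across all tasks, while S absorbs the task-specific entries, so the combined estimate S + G has a partially shared support.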


