Neural Computation

Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery



Abstract

Kernelized elastic net regularization (KENReg) is a kernelization of the well-known elastic net regularization of Zou and Hastie. The kernel in KENReg is not required to be a Mercer kernel, since the method learns from a kernelized dictionary in the coefficient space. Feng, Yang, Zhao, Lv, and Suykens showed that KENReg enjoys several desirable properties, including stability, sparseness, and generalization. In this letter, we continue the study of KENReg by conducting a refined learning theory analysis. The letter makes three main contributions. First, we present a refined error analysis of the generalization performance of KENReg. The main difficulty in analyzing the generalization error of KENReg lies in characterizing the population version of its empirical target function. We overcome this by introducing a weighted Banach space associated with the elastic net regularization, which allows us to carry out an elaborate learning theory analysis and obtain fast convergence rates under proper complexity and regularity assumptions. Second, we study the sparse recovery problem for KENReg with fixed design and show that kernelization may improve the sparse recovery ability relative to the classical elastic net regularization. Finally, we discuss the interplay among the different properties of KENReg: sparseness, stability, and generalization. We show that the stability of KENReg implies generalization and that its sparseness confidence can be derived from generalization. Moreover, KENReg can be simultaneously stable and sparse, which makes it attractive both theoretically and practically.
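To make the coefficient-space formulation concrete, the following sketch fits an elastic-net-penalized model on a kernelized dictionary by coordinate descent. The kernel bandwidth, regularization parameters, and data are illustrative assumptions, not values from the paper; the point is only that the design matrix K(x_i, z_j) need not be square, symmetric, or positive semidefinite (no Mercer condition), since the penalty acts on the coefficient vector directly.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def kenreg_fit(K, y, lam1=0.01, lam2=0.01, n_iter=200):
    """Coordinate descent for
        min_a (1/2n) ||y - K a||^2 + lam1 ||a||_1 + (lam2/2) ||a||_2^2,
    where K is an n-by-m kernelized dictionary (rows: samples,
    columns: kernel evaluations against dictionary elements)."""
    n, m = K.shape
    a = np.zeros(m)
    col_sq = (K ** 2).sum(axis=0) / n  # per-column curvature terms
    r = y - K @ a                      # current residual
    for _ in range(n_iter):
        for j in range(m):
            r += K[:, j] * a[j]        # remove column j's contribution
            rho = K[:, j] @ r / n
            a[j] = soft_threshold(rho, lam1) / (col_sq[j] + lam2)
            r -= K[:, j] * a[j]        # restore with the updated coefficient
    return a

# Demo: a rectangular Gaussian design built from 20 dictionary centers.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))           # 50 training inputs
Z = rng.normal(size=(20, 1))           # 20 dictionary centers
K = np.exp(-(X - Z.T) ** 2 / 2.0)      # 50 x 20, not PSD/symmetric
true_a = np.zeros(20)
true_a[[3, 11]] = [2.0, -1.5]          # sparse ground-truth coefficients
y = K @ true_a + 0.01 * rng.normal(size=50)
a_hat = kenreg_fit(K, y, lam1=0.01, lam2=0.01)
```

The l1 term drives most entries of `a_hat` to exactly zero (sparseness), while the l2 term keeps each coordinate update well conditioned even when dictionary columns are highly correlated, which is the stability ingredient the letter analyzes.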

Bibliographic record

  • Source
    Neural Computation | 2016, Issue 3 | pp. 525-562 | 38 pages
  • Author affiliation

    Department of Electrical Engineering, ESAT-STADIUS, KU Leuven 3000, Belgium yunlong.feng@esat.kuleuven.be;

  • Indexed in: Science Citation Index (SCI); Chemical Abstracts (CA)
  • Format: PDF
  • Language: English
