Statistics and Its Interface

Discussion on “Doubly sparsity kernel learning with automatic variable selection and data extraction”

Abstract

Kernel methods provide powerful and flexible tools for nonlinear learning in high-dimensional data analysis, but feature selection remains a challenge in kernel learning. The proposed DOSK method provides a new unified framework that implements kernel methods while automatically selecting important variables and simultaneously identifying a parsimonious subset of knots. A double penalty is employed to encourage sparsity in both the feature weights and the representer coefficients. The authors present the computational algorithm as well as the theoretical properties of the DOSK method. In this discussion, we first highlight DOSK's major contributions to the machine learning toolbox. Then we discuss its connections to other nonparametric methods in the literature and point out some possible future research directions.
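
To make the double-penalty idea concrete, the display below is a minimal schematic in the spirit of the abstract, not the authors' exact formulation: the loss L, the feature-weighted Gaussian kernel K_w, and the tuning parameters lambda_1, lambda_2 are illustrative placeholders rather than the paper's notation.

% Schematic doubly penalized kernel objective (illustrative sketch only):
% the decision function f(x) = b + \sum_j \alpha_j K_w(x, x_j) uses feature weights w
% and representer coefficients \alpha, each shrunk by its own \ell_1 penalty.
\min_{b,\,\boldsymbol{\alpha},\,\mathbf{w}\ge 0}\;
  \frac{1}{n}\sum_{i=1}^{n}
    L\!\Bigl(y_i,\; b + \sum_{j=1}^{n}\alpha_j K_{\mathbf{w}}(\mathbf{x}_i,\mathbf{x}_j)\Bigr)
  \;+\; \lambda_1 \lVert\boldsymbol{\alpha}\rVert_1
  \;+\; \lambda_2 \lVert\mathbf{w}\rVert_1,
\qquad
K_{\mathbf{w}}(\mathbf{x},\mathbf{x}') \;=\; \exp\!\Bigl(-\sum_{p=1}^{d} w_p\,(x_p - x'_p)^2\Bigr).

Under this reading, the \ell_1 penalty on w drives variable selection (irrelevant features receive zero weight), while the \ell_1 penalty on \alpha retains only a small set of knots, which corresponds to the data-extraction aspect described in the abstract.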