AAAI Conference on Artificial Intelligence

On Power-Law Kernels, Corresponding Reproducing Kernel Hilbert Space and Applications



Abstract

The role of kernels is central to machine learning. Motivated by the importance of power-law distributions in statistical modeling, in this paper, we propose the notion of power-law kernels to investigate power laws in learning problems. We propose two power-law kernels by generalizing Gaussian and Laplacian kernels. This generalization is based on distributions arising out of the maximization of a generalized information measure known as nonextensive entropy, which is well studied in statistical mechanics. We prove that the proposed kernels are positive definite, and provide some insights regarding the corresponding Reproducing Kernel Hilbert Space (RKHS). We also study the practical significance of both kernels in classification and regression, and present some simulation results.
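The abstract does not state the explicit kernel forms. The following is a minimal sketch, assuming the power-law kernels are obtained by replacing the exponential in the Gaussian and Laplacian kernels with the Tsallis q-exponential exp_q(x) = [1 + (1 - q)x]_+^{1/(1-q)}, the function associated with maximizers of nonextensive entropy; the function names, entropic index q, and bandwidth sigma below are illustrative and not taken from the paper.

```python
import numpy as np

def q_exponential(x, q):
    # Tsallis q-exponential: exp_q(x) = [1 + (1 - q) x]_+ ** (1 / (1 - q)),
    # which reduces to exp(x) as q -> 1. For the kernels below x <= 0 and
    # q > 1, so the bracket stays positive and the kernel decays as a power law.
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def power_law_gaussian_kernel(X, Y, sigma=1.0, q=1.5):
    # Hypothetical power-law analogue of the Gaussian (RBF) kernel:
    # k(x, y) = exp_q(-||x - y||^2 / (2 sigma^2)); q -> 1 recovers the Gaussian kernel.
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    sq_dists = np.maximum(sq_dists, 0.0)  # guard against tiny negatives from rounding
    return q_exponential(-sq_dists / (2.0 * sigma ** 2), q)

def power_law_laplacian_kernel(X, Y, sigma=1.0, q=1.5):
    # Hypothetical power-law analogue of the Laplacian kernel:
    # k(x, y) = exp_q(-||x - y||_1 / sigma); q -> 1 recovers the Laplacian kernel.
    l1_dists = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=2)
    return q_exponential(-l1_dists / sigma, q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))
    K = power_law_gaussian_kernel(X, X, sigma=1.0, q=1.5)
    print(K.shape, np.allclose(K, K.T))  # (5, 5) True -- symmetric Gram matrix
```

Such Gram matrices could, for example, be passed to an SVM or kernel ridge regression through a precomputed-kernel interface; the entropic index q controls how heavy the kernel's tail is, with q -> 1 recovering the exponential-family kernels.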
