
Learning Deep Kernels in the Space of Dot Product Polynomials

Abstract

Recent literature has shown the merits of having deep representations in the context of neural networks. An emerging challenge in kernel learning is the definition of similarly deep representations. In this paper, we propose a general methodology that defines a hierarchy of base kernels of increasing expressiveness and combines them via multiple kernel learning (MKL), with the aim of generating overall deeper kernels. As a leading example, this methodology is applied to learning the kernel in the space of Dot-Product Polynomials (DPPs), that is, positive combinations of homogeneous polynomial kernels (HPKs). We present theoretical properties of the expressiveness of HPKs that make their combination empirically very effective. This can also be seen as learning the coefficients of the Maclaurin expansion of any positive definite dot-product kernel, thus making the proposed method generally applicable. We empirically demonstrate the merits of our approach by comparing the kernels generated by our method against baseline kernels (including homogeneous and non-homogeneous polynomial kernels, the RBF kernel, etc.) and against another hierarchical approach on several benchmark datasets.
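To make the DPP construction described above concrete, the following minimal Python sketch builds a dot-product polynomial kernel as a nonnegative combination of homogeneous polynomial kernels. It is not the authors' implementation: the weight vector a is a hypothetical stand-in for the coefficients that an MKL solver would learn, and the function names hpk and dpp_kernel are illustrative.

import numpy as np

def hpk(X, Z, degree):
    # Homogeneous polynomial kernel of the given degree: k(x, z) = (x . z)^degree
    return (X @ Z.T) ** degree

def dpp_kernel(X, Z, weights):
    # Dot-product polynomial: a nonnegative weighted sum of HPKs,
    # where weights[d] multiplies the degree-d term.
    K = np.zeros((X.shape[0], Z.shape[0]))
    for d, a_d in enumerate(weights):
        K += a_d * hpk(X, Z, d)
    return K

# Toy usage with hypothetical nonnegative weights for degrees 0..3.
X = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
a = [0.1, 0.4, 0.3, 0.2]
K = dpp_kernel(X, X, a)
print(K)  # 3x3 positive semidefinite Gram matrix

Because each HPK Gram matrix is positive semidefinite and the weights are nonnegative, the resulting combination is itself a valid positive definite kernel; learning the weights amounts to learning the Maclaurin-expansion coefficients mentioned in the abstract.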
