Journal: Machine Learning

Learning deep kernels in the space of dot product polynomials



Abstract

Recent literature has shown the merits of deep representations in the context of neural networks. An emerging challenge in kernel learning is the definition of similarly deep representations. In this paper, we propose a general methodology for defining a hierarchy of base kernels of increasing expressiveness and combining them via multiple kernel learning (MKL), with the aim of generating overall deeper kernels. As a leading example, this methodology is applied to learning the kernel in the space of Dot-Product Polynomials (DPPs), that is, positive combinations of homogeneous polynomial kernels (HPKs). We show theoretical properties about the expressiveness of HPKs that make their combination empirically very effective. This can also be seen as learning the coefficients of the Maclaurin expansion of any positive definite dot-product kernel, which makes our proposed method generally applicable. We empirically demonstrate the merits of our approach by comparing the effectiveness of the kernel generated by our method against baseline kernels (including homogeneous and non-homogeneous polynomials, RBF, etc.) and against another hierarchical approach on several benchmark datasets.
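The DPP construction described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names are ours, and the combination coefficients are fixed by hand here, whereas the paper learns them via MKL. The sketch only shows the structure of a DPP kernel as a nonnegative combination of homogeneous polynomial kernels, K(x, y) = Σ_d a_d (x·y)^d with a_d ≥ 0.

```python
import numpy as np

def hpk(X, Y, degree):
    """Homogeneous polynomial kernel of a given degree: K(x, y) = (x . y)^degree."""
    return (X @ Y.T) ** degree

def dpp_kernel(X, Y, coeffs):
    """Dot-Product Polynomial kernel: a nonnegative combination of HPKs.

    K(x, y) = sum_d coeffs[d] * (x . y)^d, with coeffs[d] >= 0.
    In the paper's setting the coefficients would be learned by MKL;
    here they are simply supplied by the caller.
    """
    K = np.zeros((X.shape[0], Y.shape[0]))
    for d, a in enumerate(coeffs):
        K += a * hpk(X, Y, d)
    return K
```

As a sanity check on the Maclaurin-expansion view: choosing coefficients a_d = 1/d! recovers (a truncation of) the exponential dot-product kernel exp(x·y), since exp(t) = Σ_d t^d/d!, and the resulting Gram matrix remains positive semidefinite because each HPK is.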


