
Multiple Operator-valued Kernel Learning



Abstract

Positive definite operator-valued kernels generalize the well-known notion of reproducing kernels, and are naturally adapted to multi-output learning situations. This paper addresses the problem of learning a finite linear combination of infinite-dimensional operator-valued kernels which are suitable for extending functional data analysis methods to nonlinear contexts. We study this problem in the case of kernel ridge regression for functional responses with an ℓ_γ-norm constraint on the combination coefficients (γ > 1). The resulting optimization problem is more involved than those of multiple scalar-valued kernel learning since operator-valued kernels pose more technical and theoretical issues. We propose a multiple operator-valued kernel learning algorithm based on solving a system of linear operator equations by using a block coordinate-descent procedure. We experimentally validate our approach on a functional regression task in the context of finger movement prediction in brain-computer interfaces.
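The abstract only summarizes the procedure, so the sketch below is a rough finite-dimensional illustration rather than the paper's method: it uses separable matrix-valued kernels K_m(x, x') = k_m(x, x') T_m with fixed PSD output matrices T_m, vector responses in place of functional ones, and the standard closed-form ℓ_γ weight update known from scalar ℓ_p-norm multiple kernel learning instead of the paper's system of linear operator equations. The kernel widths, output operators, regularization, and synthetic data are all assumptions made for the example.

```python
# Minimal sketch (assumptions noted above): block coordinate descent that
# alternates a kernel ridge regression solve in the weighted combination of
# separable operator-valued kernels with a closed-form l_gamma weight update.
import numpy as np

def gaussian_gram(X, width):
    """n x n Gaussian Gram matrix with the given bandwidth (assumed kernel family)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * width**2))

def movkl_ridge(X, Y, widths, T_list, gamma=2.0, lam=1e-2, n_iter=20):
    """Learn combination weights d (||d||_gamma = 1) and expansion coefficients c."""
    n, p = Y.shape
    M = len(widths)
    grams = [gaussian_gram(X, w) for w in widths]        # scalar parts k_m
    d = np.full(M, M ** (-1.0 / gamma))                  # uniform start, ||d||_gamma = 1
    y = Y.reshape(-1)                                    # vec of the vector responses
    for _ in range(n_iter):
        # Step 1: ridge solve in the combined kernel; for separable kernels the
        # full Gram is sum_m d_m * (G_m kron T_m), of size (n*p) x (n*p).
        K = sum(dm * np.kron(Gm, Tm) for dm, Gm, Tm in zip(d, grams, T_list))
        c = np.linalg.solve(K + lam * np.eye(n * p), y)
        # Step 2: closed-form l_gamma update from the per-kernel norms
        # ||f_m||^2 = d_m^2 * c^T (G_m kron T_m) c, then renormalize to the
        # l_gamma unit ball (standard l_p-norm MKL step, assumed here).
        norms = np.array([dm**2 * c @ np.kron(Gm, Tm) @ c
                          for dm, Gm, Tm in zip(d, grams, T_list)])
        norms = np.maximum(norms, 1e-12)
        d = norms ** (1.0 / (gamma + 1.0))
        d /= np.linalg.norm(d, ord=gamma)
    return d, c

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 3))
    W = rng.normal(size=(3, 2))
    Y = np.tanh(X @ W) + 0.05 * rng.normal(size=(40, 2))  # 2-dimensional outputs
    widths = [0.5, 1.0, 2.0]                              # three candidate kernels
    T_list = [np.eye(2) for _ in widths]                  # identity output operators
    d, _ = movkl_ridge(X, Y, widths, T_list, gamma=2.0)
    print("learned kernel weights:", np.round(d, 3))
```

Running the script prints the learned kernel weights; as γ approaches 1 the update concentrates mass on few kernels (sparser combinations), while larger γ spreads the weight more evenly across the candidate kernels.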
