Deep Learning in Semantic Kernel Spaces

Abstract

Kernel methods enable the direct use of structured representations of textual data in language learning and inference tasks. Expressive kernels, such as Tree Kernels, achieve excellent performance in NLP. On the other hand, deep neural networks have proven effective at automatically learning feature representations during training. However, their input is tensor data, i.e., they cannot directly handle rich structured information. In this paper, we show that expressive kernels and deep neural networks can be combined in a common framework in order to (i) explicitly model structured information and (ii) learn non-linear decision functions. We show that the input layer of a deep architecture can be pre-trained through the application of the Nystroem low-rank approximation of kernel spaces. The resulting "kernelized" neural network achieves state-of-the-art accuracy in three different tasks.
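The pre-training step referenced in the abstract rests on the Nystroem low-rank approximation: kernel evaluations of an example against a set of m landmark examples are projected through the inverse square root of the landmark Gram matrix, producing a dense vector whose inner products approximate the original kernel. The sketch below illustrates this idea only; the names nystroem_projector, kernel, and landmarks are illustrative assumptions (not the authors' implementation), a toy RBF kernel stands in for the tree kernels used in the paper, and the feed-forward network that would consume the resulting vectors is omitted.

import numpy as np

def nystroem_projector(kernel, landmarks, eps=1e-12):
    # Gram matrix over the m landmarks: W[i, j] = kernel(l_i, l_j)
    W = np.array([[kernel(a, b) for b in landmarks] for a in landmarks])
    # Eigendecomposition of the symmetric matrix W = U diag(s) U^T
    s, U = np.linalg.eigh(W)
    s = np.clip(s, eps, None)          # guard against tiny negative eigenvalues
    proj = U / np.sqrt(s)              # columns scaled by s^(-1/2), i.e. U diag(s)^(-1/2)

    def phi(x):
        # Kernel evaluations against the landmarks, projected into the low-rank space;
        # dot products of these vectors approximate kernel(x, y) in the original space.
        c = np.array([kernel(x, l) for l in landmarks])
        return c @ proj

    return phi

# Usage sketch with a toy RBF kernel over dense vectors; in the paper's setting a
# tree kernel over parse trees would play this role, and the 50-dimensional phi(x)
# vectors would initialize the input layer of the deep architecture.
rbf = lambda a, b: float(np.exp(-np.linalg.norm(a - b) ** 2))
landmarks = [np.random.randn(5) for _ in range(50)]
phi = nystroem_projector(rbf, landmarks)
features = phi(np.random.randn(5))     # shape (50,)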