Multiple kernel subspace clustering with local structural graph and low-rank consensus kernel learning



Abstract

Multiple kernel learning (MKL) methods are generally believed to outperform single kernel learning (SKL) methods on nonlinear subspace clustering problems, largely because MKL avoids the need to select and tune a single pre-defined kernel. However, previous MKL methods have mainly focused on how to define a kernel weighting strategy, while ignoring the structural characteristics of the input data in both the original space and the kernel space. In this paper, we propose a novel graph-based MKL method for subspace clustering, namely Local Structural Graph and Low-Rank Consensus Multiple Kernel Learning (LLMKL). It jointly learns an optimal affinity graph and a suitable consensus kernel for clustering by integrating the MKL technique, the global structure in the kernel space, the local structure in the original space, and the self-expressiveness property in Hilbert space into a unified optimization model. In particular, to capture the global structure of the data, we employ a substitute for the desired consensus kernel and impose a low-rank constraint on this substitute, encouraging the structure of linear subspaces to be present in the feature space. Moreover, the local structure of the data is explored by building a complete graph in which each sample is treated as a node and each edge encodes the pairwise affinity between two samples. In this way, the consensus kernel learning and the affinity graph learning promote each other, so that the data in the resulting Hilbert space are both self-expressive and low-rank. Experiments on both image and text clustering demonstrate that LLMKL outperforms state-of-the-art methods. (C) 2019 Elsevier B.V. All rights reserved.
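
As a concrete illustration of the machinery the abstract builds on, below is a minimal Python sketch of kernel self-expressive subspace clustering with a fixed-weight consensus kernel. It is not the LLMKL algorithm itself (LLMKL jointly learns the kernel weights, a low-rank consensus kernel, and a local structural affinity graph); the sketch fixes uniform weights and solves only the ridge-regularized self-expressiveness step min_C ||Phi(X) - Phi(X)C||_F^2 + lambda*||C||_F^2, whose closed-form solution in kernel form is C = (K + lambda*I)^{-1} K. All function names and parameter choices here are illustrative assumptions.

```python
# Minimal sketch of kernel self-expressive subspace clustering with a
# fixed-weight consensus kernel. This is NOT the paper's LLMKL method:
# LLMKL jointly learns the kernel weights, a low-rank consensus kernel,
# and a local structural graph, while this sketch fixes uniform weights
# and solves only the basic self-expressiveness step.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel


def consensus_kernel(X, weights=None):
    """Weighted sum of base kernels (LLMKL would learn the weights)."""
    base = [
        linear_kernel(X),
        polynomial_kernel(X, degree=2),
        rbf_kernel(X, gamma=1.0 / X.shape[1]),
    ]
    if weights is None:
        weights = np.full(len(base), 1.0 / len(base))
    return sum(w * K for w, K in zip(weights, base))


def self_expressive_coefficients(K, lam=0.1):
    """Closed-form solution of
        min_C ||phi(X) - phi(X) C||_F^2 + lam * ||C||_F^2,
    expressed purely through the kernel matrix: C = (K + lam*I)^{-1} K."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), K)


def kernel_subspace_clustering(X, n_clusters, lam=0.1):
    K = consensus_kernel(X)
    C = self_expressive_coefficients(K, lam)
    W = 0.5 * (np.abs(C) + np.abs(C).T)  # symmetric affinity graph
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                            random_state=0)
    return sc.fit_predict(W)
```

The symmetrized |C| serves as the affinity graph consumed by the spectral clustering step; in LLMKL this graph is co-optimized with the consensus kernel rather than built once afterwards.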
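The low-rank constraint the abstract places on the consensus-kernel substitute is typically enforced through a nuclear-norm penalty, whose proximal operator is singular value thresholding. A generic sketch of that operator follows (a standard tool, not code from the paper):

```python
import numpy as np


def nuclear_norm_prox(K, tau):
    """Proximal operator of tau * ||.||_* : shrink the singular values of K.
    Optimization schemes that impose a low-rank constraint, such as the one
    LLMKL places on its consensus-kernel substitute, apply a step like this
    inside an alternating minimization loop."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```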

Bibliographic details

  • Source
    Knowledge-Based Systems | 2020, No. 5 | pp. 105040.1-105040.9 | 9 pages
  • Authors

  • Author affiliations

    Southwest Univ Sci & Technol Sch Natl Def Sci & Technol Mianyang 621010 Sichuan Peoples R China|Nanjing Univ Sci & Technol Dept Comp Sci Nanjing 210094 Peoples R China;

    Southwest Univ Sci & Technol Sch Informat Engn Mianyang 621010 Sichuan Peoples R China;

    Southwest Univ Sci & Technol Sch Natl Def Sci & Technol Mianyang 621010 Sichuan Peoples R China;

    Nanjing Univ Sci & Technol Dept Comp Sci Nanjing 210094 Peoples R China;

  • Indexing information
  • Original format: PDF
  • Language: eng
  • CLC classification
  • Keywords

    Multiple kernel learning; Subspace clustering; Self-expressiveness; Structure learning; Low-rank kernel;

