Journal: Neural Networks: The Official Journal of the International Neural Network Society

Joint Ranking SVM and Binary Relevance with robust Low-rank learning for multi-label classification


Abstract

Multi-label classification studies the task where each example belongs to multiple labels simultaneously. As a representative method, Ranking Support Vector Machine (Rank-SVM) aims to minimize the Ranking Loss and can also mitigate the negative influence of the class-imbalance issue. However, because it learns the thresholding function in a separate stacking step, it may suffer from error accumulation, which reduces the final classification performance. Binary Relevance (BR) is another typical method, which aims to minimize the Hamming Loss and needs only one-step learning. Nevertheless, it may suffer from the class-imbalance issue and does not take label correlations into account. To address the above issues, we propose a novel multi-label classification model, which joins Ranking SVM and Binary Relevance with robust Low-rank learning (RBRL). RBRL inherits the Ranking Loss minimization advantage of Rank-SVM, and thus overcomes the disadvantages of BR, namely suffering from the class-imbalance issue and ignoring label correlations. Meanwhile, it utilizes the Hamming Loss minimization and one-step learning advantages of BR, and thus avoids the disadvantage of Rank-SVM, namely its additional threshold-learning step. Besides, a low-rank constraint is utilized to further exploit high-order label correlations under the assumption of a low-dimensional label space. Furthermore, to achieve nonlinear multi-label classifiers, we derive the kernelized RBRL. Two accelerated proximal gradient (APG) methods are used to solve the optimization problems efficiently. Extensive comparative experiments with several state-of-the-art methods demonstrate the highly competitive or superior performance of our method RBRL. (c) 2019 Elsevier Ltd. All rights reserved.
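For readers unfamiliar with the two evaluation measures the abstract contrasts, below is a minimal plain-Python sketch (not the authors' code; `Y`, `S`, and the 0.5 threshold are illustrative choices) of the Hamming Loss that BR minimizes and the Ranking Loss that Rank-SVM targets:

```python
def hamming_loss(Y_true, Y_pred):
    """Fraction of individual label assignments that are wrong,
    averaged over all examples and labels (the quantity BR minimizes)."""
    total = sum(len(y) for y in Y_true)
    wrong = sum(t != p
                for y_t, y_p in zip(Y_true, Y_pred)
                for t, p in zip(y_t, y_p))
    return wrong / total

def ranking_loss(Y_true, scores):
    """Per-example fraction of (relevant, irrelevant) label pairs that
    the score vector orders incorrectly, averaged over examples
    (the quantity whose surrogate Rank-SVM minimizes)."""
    per_example = []
    for y, s in zip(Y_true, scores):
        pos = [j for j, v in enumerate(y) if v == 1]
        neg = [j for j, v in enumerate(y) if v == 0]
        if not pos or not neg:      # undefined when one side is empty
            continue
        bad = sum(1 for p in pos for n in neg if s[p] <= s[n])
        per_example.append(bad / (len(pos) * len(neg)))
    return sum(per_example) / len(per_example)

# Toy data: 2 examples, 3 labels; label scores thresholded at 0.5.
Y = [[1, 0, 1], [0, 1, 0]]
S = [[0.9, 0.2, 0.1], [0.1, 0.8, 0.3]]
P = [[1 if v > 0.5 else 0 for v in row] for row in S]
print(hamming_loss(Y, P))   # 1/6: one of six label assignments is wrong
print(ranking_loss(Y, S))   # 0.25: example 1 mis-orders one of its two pairs
```

The toy example shows why the two measures can disagree: the second relevant label of example 1 is both thresholded out (hurting Hamming Loss) and scored below an irrelevant label (hurting Ranking Loss), which is the kind of interaction RBRL optimizes jointly rather than in separate stages.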
