International Journal of Machine Learning and Cybernetics

v-soft margin multi-task learning logistic regression



Abstract

Coordinate descent (CD) is an effective method for large-scale classification problems, with simple operations and fast convergence. In this paper, inspired by the v-soft margin support vector machine and the multi-task learning support vector machine for classification, a novel v-soft margin multi-task learning logistic regression (v-SMMTL-LR) for pattern classification is proposed to improve the generalization performance of logistic regression (LR). The dual of v-SMMTL-LR can be viewed as a dual coordinate descent (CDdual) problem with an equality constraint, and from it a large-scale classification method named v-SMMTL-LR-CDdual is developed. The proposed v-SMMTL-LR-CDdual method maximizes the between-class margin and effectively improves the generalization performance of LR in large-scale multi-task learning scenarios. Experimental results show that v-SMMTL-LR-CDdual is effective on large-scale or comparatively high-dimensional multi-task datasets and is competitive with other related single-task and multi-task learning algorithms.
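The abstract only names the CDdual scheme without giving its update rules. As a rough illustration of how dual coordinate descent works for plain single-task L2-regularized logistic regression, a minimal sketch is given below. It follows the standard LR dual (as in Yu, Huang and Lin, 2011), not the paper's v-SMMTL-LR formulation, which additionally carries an equality constraint and multi-task coupling; all function and variable names here are hypothetical.

```python
# Minimal sketch: dual coordinate descent (CDdual) for single-task
# L2-regularized logistic regression.  Illustration only; this is NOT
# the paper's v-SMMTL-LR method.
import numpy as np

def lr_cddual(X, y, C=1.0, max_outer=50, eps=1e-12):
    """X: (n, d) features; y: (n,) labels in {-1, +1}; C: soft-margin parameter."""
    n, d = X.shape
    alpha = np.full(n, 0.1 * C)                 # start strictly inside (0, C)
    w = X.T @ (alpha * y)                       # primal weights kept in sync with alpha
    Qii = np.einsum('ij,ij->i', X, X)           # diagonal of Q, Q_ij = y_i y_j x_i^T x_j

    for _ in range(max_outer):
        for i in np.random.permutation(n):
            a = alpha[i]
            # gradient and curvature of the i-th one-variable dual subproblem:
            #   (Q alpha)_i + log(a / (C - a))  and  Q_ii + C / (a (C - a))
            g = y[i] * (w @ X[i]) + np.log(a / (C - a))
            h = Qii[i] + C / (a * (C - a))
            # one Newton step, clipped back into (0, C); a production solver
            # would damp the step for a guaranteed objective decrease
            a_new = np.clip(a - g / h, eps, C - eps)
            w += (a_new - a) * y[i] * X[i]      # incremental primal update
            alpha[i] = a_new
    return w

# toy usage
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=200) > 0, 1.0, -1.0)
w = lr_cddual(X, y, C=1.0)
print((np.sign(X @ w) == y).mean())             # training accuracy of the sketch
```

The key property this sketch shares with the CDdual approach described in the abstract is that each coordinate update touches only one example and maintains the primal weight vector incrementally, which is what makes the method cheap on large-scale data.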
