Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining (PAKDD 2012)

Domain Transfer Dimensionality Reduction via Discriminant Kernel Learning

Abstract

Kernel discriminant analysis (KDA) is a popular technique for discriminative dimensionality reduction in data analysis. However, when only a limited amount of labeled data is available, it is often hard to extract the required low-dimensional representation from a high-dimensional feature space. One therefore expects to improve performance by exploiting labeled data from other domains. In this paper, we propose a method, referred to as domain transfer discriminant kernel learning (DTDKL), which finds the optimal kernel for discriminant dimensionality reduction by using additional labeled data drawn from an out-of-domain distribution. Our method learns a kernel function and a discriminative projection by simultaneously maximizing the Fisher discriminant distance and minimizing the mismatch between the in-domain and out-of-domain distributions, yielding a better feature space for discriminative dimensionality reduction across domains.
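The objective described in the abstract couples a kernel Fisher criterion with a term penalizing the mismatch between the in-domain and out-of-domain distributions. The sketch below is a minimal illustration of that trade-off, not the authors' implementation: it assumes an RBF kernel family, uses maximum mean discrepancy (MMD) as the mismatch measure, a class-mean Fisher score as the discriminant surrogate, and a simple grid search over bandwidths in place of the paper's kernel learning. All function and variable names (rbf_kernel, mmd2, fisher_score, select_kernel) are hypothetical.

```python
# Illustrative sketch of the DTDKL trade-off: pick a kernel that maximizes a
# Fisher-style discriminant criterion while minimizing the MMD between the
# in-domain and out-of-domain samples. Assumptions only; not the paper's method.
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF kernel matrix between rows of A and rows of B."""
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def mmd2(K, n_in, n_out):
    """Squared MMD between the first n_in and the last n_out samples,
    computed from the full (n_in + n_out) x (n_in + n_out) kernel matrix."""
    K_ii = K[:n_in, :n_in]
    K_oo = K[n_in:, n_in:]
    K_io = K[:n_in, n_in:]
    return K_ii.mean() + K_oo.mean() - 2.0 * K_io.mean()

def fisher_score(K, y):
    """Ratio of between-class to within-class scatter of the kernel rows;
    a simple surrogate for the kernel Fisher discriminant criterion."""
    overall_mean = K.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        class_mean = K[idx].mean(axis=0)
        between += len(idx) * np.sum((class_mean - overall_mean) ** 2)
        within += np.sum((K[idx] - class_mean) ** 2)
    return between / (within + 1e-12)

def select_kernel(X_in, y_in, X_out, y_out, gammas, lam=1.0):
    """Grid-search an RBF bandwidth trading off discriminability against
    domain mismatch: maximize fisher_score(K, y) - lam * mmd2(K, ...)."""
    X = np.vstack([X_in, X_out])
    y = np.concatenate([y_in, y_out])
    best_gamma, best_obj = None, -np.inf
    for gamma in gammas:
        K = rbf_kernel(X, X, gamma)
        obj = fisher_score(K, y) - lam * mmd2(K, len(X_in), len(X_out))
        if obj > best_obj:
            best_gamma, best_obj = gamma, obj
    return best_gamma, best_obj

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two classes; the out-of-domain samples are slightly shifted.
    X_in = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
    y_in = np.array([0] * 20 + [1] * 20)
    X_out = np.vstack([rng.normal(0.5, 1, (30, 5)), rng.normal(2.5, 1, (30, 5))])
    y_out = np.array([0] * 30 + [1] * 30)
    gamma, obj = select_kernel(X_in, y_in, X_out, y_out, gammas=[0.01, 0.1, 1.0])
    print(f"selected gamma={gamma}, objective={obj:.4f}")
```

In the paper the kernel and the discriminative projection are learned jointly; the grid search above only conveys the shape of the objective being optimized, not the optimization procedure itself.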
