IEEE Transactions on Cognitive and Developmental Systems
Minimum Adversarial Distribution Discrepancy for Domain Adaptation

Abstract

Domain adaptation (DA) refers to generalizing a learning technique across a source domain and a target domain that follow different distributions. The essential problem in DA is therefore how to reduce the distribution discrepancy between the source and target domains. Typical methods embed adversarial learning into deep networks to learn transferable feature representations. However, existing adversarial DA methods may not sufficiently minimize the distribution discrepancy. In this article, a DA method, minimum adversarial distribution discrepancy (MADD), is proposed that combines feature-distribution matching with adversarial learning. Specifically, we design a novel divergence metric loss, named maximum mean discrepancy based on conditional entropy (MMD-CE), and embed it in the adversarial DA network. The proposed MMD-CE loss addresses two problems: 1) the misalignment between the class distributions of the two domains and 2) the equilibrium challenge in adversarial DA. Comparative experiments against state-of-the-art methods on the Office-31, ImageCLEF-DA, and Office-Home data sets show that our method achieves competitive performance.
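The abstract does not give the exact form of the MMD-CE loss, but the plain maximum mean discrepancy (MMD) it builds on can be sketched as follows. This is a minimal, hypothetical illustration using an RBF kernel on batches of extracted features; the function names and the `gamma` bandwidth parameter are assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel between rows of a and b:
    # k(x, y) = exp(-gamma * ||x - y||^2).
    sq_dists = (
        np.sum(a ** 2, axis=1)[:, None]
        + np.sum(b ** 2, axis=1)[None, :]
        - 2.0 * a @ b.T
    )
    return np.exp(-gamma * sq_dists)

def mmd2(source_feats, target_feats, gamma=1.0):
    """Biased estimate of the squared MMD between two feature batches.

    A small value means the two batches are hard to distinguish,
    i.e. the feature distributions are well aligned.
    """
    k_ss = rbf_kernel(source_feats, source_feats, gamma)
    k_tt = rbf_kernel(target_feats, target_feats, gamma)
    k_st = rbf_kernel(source_feats, target_feats, gamma)
    return k_ss.mean() + k_tt.mean() - 2.0 * k_st.mean()
```

In an adversarial DA network, a term like `mmd2` would be added to the training objective so that the feature extractor is pushed to reduce the source/target discrepancy; MADD's MMD-CE variant additionally conditions this on class information via conditional entropy, which this plain sketch does not capture.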
