Home > Foreign Conference Papers > European Conference on Computer Vision > Inter-battery Topic Representation Learning

Inter-battery Topic Representation Learning



Abstract

In this paper, we present the Inter-Battery Topic Model (IBTM). Our approach extends traditional topic models by learning a factorized latent variable representation. The structured representation leads to a model that marries the benefits traditionally associated with discriminative approaches, such as feature selection, with those of generative models, such as principled regularization and the ability to handle missing data. The factorization is obtained by representing the data as aligned pairs of observations from different views. This provides a means of learning a representation that separates the topics shared by both views from the topics unique to a single view. This structured consolidation allows for efficient and robust inference and yields a compact representation. Learning is performed in a Bayesian fashion by maximizing a rigorous bound on the log-likelihood. First, we illustrate the benefits of the model on a synthetic dataset. The model is then evaluated in both uni- and multi-modal settings on two different classification tasks using off-the-shelf convolutional neural network (CNN) features, achieving state-of-the-art results with extremely compact representations.
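The shared/private factorization over paired views can be illustrated with a minimal sketch. This is not the paper's Bayesian topic model; it is the classical "inter-battery" idea (recovering the subspace shared by two views from their cross-covariance) on synthetic Gaussian data, with all dimensions and loading matrices chosen here purely for illustration:

```python
import numpy as np

# Toy illustration of the inter-battery idea behind IBTM: paired views share
# some latent factors while each view also carries private ones.  Here the
# shared structure is recovered by an SVD of the cross-covariance matrix;
# the paper instead learns the factorization as a Bayesian topic model.
rng = np.random.default_rng(0)
n, d1, d2, k = 2000, 20, 15, 3          # samples, view dims, shared factors

z_shared = rng.standard_normal((n, k))  # factors present in BOTH views
z_priv1 = rng.standard_normal((n, k))   # factors unique to view 1
z_priv2 = rng.standard_normal((n, k))   # factors unique to view 2

W1 = rng.standard_normal((k, d1))       # shared loadings, view 1
W2 = rng.standard_normal((k, d2))       # shared loadings, view 2
B1 = rng.standard_normal((k, d1))       # private loadings, view 1
B2 = rng.standard_normal((k, d2))       # private loadings, view 2

X = z_shared @ W1 + 0.3 * z_priv1 @ B1 + 0.05 * rng.standard_normal((n, d1))
Y = z_shared @ W2 + 0.3 * z_priv2 @ B2 + 0.05 * rng.standard_normal((n, d2))

# Private factors are independent across views, so they approximately cancel
# in the cross-covariance; only the shared structure survives.
C = (X - X.mean(0)).T @ (Y - Y.mean(0)) / n
U, s, Vt = np.linalg.svd(C)

# The top-k singular values carry the shared subspace; the tail is noise.
print("leading singular values:", np.round(s[: k + 2], 2))
```

The sharp drop after the k-th singular value is what makes the separation possible: directions driven by view-private factors contribute almost nothing to the cross-covariance, which is the intuition behind modeling shared and private topics with distinct latent variables.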
