International Conference on Multimedia Modeling

Interactive Search and Exploration in Discussion Forums Using Multimodal Embeddings

Abstract

In this paper we present a novel interactive multimodal learning system, which facilitates search and exploration in large networks of social multimedia users. It allows the analyst to identify and select users of interest, and to find similar users in an interactive learning setting. Our approach is based on novel multimodal representations of users, words and concepts, which we simultaneously learn by deploying a general-purpose neural embedding model. The usefulness of the approach is evaluated using artificial actors that simulate user behavior in a relevance feedback scenario. Multiple experiments were conducted to evaluate the quality of our multimodal representations and compare different embedding strategies. We demonstrate the capabilities of the proposed approach on a multimedia collection originating from the violent online extremism forum Stormfront, which is particularly interesting due to the high semantic level of the discussions it features.
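
The abstract describes the method only at a high level. Below is a minimal, hypothetical sketch of the two ideas it mentions: embedding users, words and visual concepts in one shared space with a general-purpose neural embedding model, and refining a user-of-interest query via relevance feedback. The USER::/CONCEPT:: token prefixes, the toy posts, and the Rocchio-style update are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the paper's code): embed users, words and image concepts in one space
# by feeding them as tokens of the same "sentence" to a word2vec-style model.
import numpy as np
from gensim.models import Word2Vec

# Each forum post becomes one token sequence: author id + post words + detected image concepts.
posts = [
    {"user": "u1", "words": ["training", "camp", "rifle"], "concepts": ["weapon", "outdoor"]},
    {"user": "u2", "words": ["rally", "flag", "march"],    "concepts": ["crowd", "flag"]},
    {"user": "u1", "words": ["survival", "gear", "camp"],  "concepts": ["outdoor"]},
]
sentences = [
    [f"USER::{p['user']}"] + p["words"] + [f"CONCEPT::{c}" for c in p["concepts"]]
    for p in posts
]

# Skip-gram training: co-occurring users, words and concepts end up close together.
model = Word2Vec(sentences, vector_size=64, window=10, min_count=1, sg=1, epochs=50)

# Exploration: nearest neighbours of a selected user of interest.
print(model.wv.most_similar("USER::u1", topn=3))

# Interactive relevance feedback (Rocchio-style update, an assumption on our part):
# pull the query vector toward users the analyst marks relevant, away from the rest.
def feedback(query, positives, negatives, alpha=1.0, beta=0.75, gamma=0.15):
    q = alpha * query
    if positives:
        q += beta * np.mean([model.wv[f"USER::{u}"] for u in positives], axis=0)
    if negatives:
        q -= gamma * np.mean([model.wv[f"USER::{u}"] for u in negatives], axis=0)
    return q

query = feedback(model.wv["USER::u1"], positives=["u1"], negatives=["u2"])
print(model.wv.similar_by_vector(query, topn=3))
```

In this sketch, similar users are simply nearest neighbours of the updated query vector; an artificial actor could supply the positive and negative labels in a simulated relevance feedback loop.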
