International Conference on Multimedia Modeling

Interactive Search and Exploration in Discussion Forums Using Multimodal Embeddings



Abstract

In this paper we present a novel interactive multimodal learning system, which facilitates search and exploration in large networks of social multimedia users. It allows the analyst to identify and select users of interest, and to find similar users in an interactive learning setting. Our approach is based on novel multimodal representations of users, words and concepts, which we simultaneously learn by deploying a general-purpose neural embedding model. The usefulness of the approach is evaluated using artificial actors, which simulate user behavior in a relevance feedback scenario. Multiple experiments were conducted in order to evaluate the quality of our multimodal representations and compare different embedding strategies. We demonstrate the capabilities of the proposed approach on a multimedia collection originating from the violent online extremism forum Stormfront, which is particularly interesting due to the high semantic level of the discussions it features.
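The abstract names only a "general-purpose neural embedding model" for jointly learning user, word, and concept representations, followed by relevance-feedback-style interactive search. The sketch below is one plausible, hypothetical instantiation, not the authors' implementation: forum posts are flattened into pseudo-sentences that interleave user tokens, word tokens, and visual concept tokens, a skip-gram model (gensim word2vec) places co-occurring entities in one shared space, and a single feedback round ranks candidate users by similarity to analyst-selected positives. All identifiers and the toy corpus are invented for illustration.

```python
# Minimal sketch (assumption): joint user/word/concept embeddings via skip-gram,
# plus one round of relevance feedback. Toy data; not the paper's actual pipeline.
import numpy as np
from gensim.models import Word2Vec

# Each forum post becomes one pseudo-sentence mixing three token types:
# a user token ("u:*"), plain word tokens, and visual concept tokens ("c:*").
posts = [
    ["u:alice", "rally", "flag", "c:crowd", "c:banner"],
    ["u:bob",   "rally", "march", "c:crowd"],
    ["u:carol", "music", "festival", "c:stage"],
]

# Skip-gram (sg=1) pulls co-occurring users, words and concepts close together,
# so all three entity types live in the same embedding space.
model = Word2Vec(sentences=posts, vector_size=32, window=5,
                 min_count=1, sg=1, epochs=50)

def rank_users(positive_users, candidate_users, wv):
    """Rank candidates by cosine similarity to the mean vector of the
    analyst-selected positive users (one relevance-feedback iteration)."""
    query = np.mean([wv[u] for u in positive_users], axis=0)

    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scored = [(u, cos(wv[u], query))
              for u in candidate_users if u not in positive_users]
    return sorted(scored, key=lambda item: item[1], reverse=True)

print(rank_users(["u:alice"], ["u:alice", "u:bob", "u:carol"], model.wv))
# On this toy corpus, u:bob should rank above u:carol, since bob shares the
# "rally" / "c:crowd" context with alice.
```

In this framing, "finding similar users" and "finding users related to a word or visual concept" are the same nearest-neighbor query in the shared space, which is what makes a single embedding over all three entity types convenient for interactive exploration.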

