Conference: 8th IEEE International Conference on Data Mining (ICDM 2008)

On-line LDA: Adaptive Topic Models for Mining Text Streams with Applications to Topic Detection and Tracking



Abstract

This paper presents Online Topic Model (OLDA), a topic model that automatically captures the thematic patterns and identifies emerging topics of text streams and their changes over time. Our approach allows the topic modeling framework, specifically the Latent Dirichlet Allocation (LDA) model, to work in an online fashion such that it incrementally builds an up-to-date model (mixture of topics per document and mixture of words per topic) when a new document (or a set of documents) appears. A solution based on the Empirical Bayes method is proposed. The idea is to incrementally update the current model according to the information inferred from the new stream of data, with no need to access previous data. The dynamics of the proposed approach also provide an efficient means to track topics over time and detect emerging topics in real time. Our method is evaluated both qualitatively and quantitatively using benchmark datasets. In our experiments, OLDA discovered interesting patterns by analyzing just a fraction of the data at a time. Our tests also demonstrate the ability of OLDA to align topics across epochs, through which the evolution of topics over time is captured. OLDA is also comparable to, and sometimes better than, the original LDA in predicting the likelihood of unseen documents.
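The incremental update described in the abstract can be illustrated with a minimal sketch, which is not the authors' implementation: LDA is fit on each epoch's batch with collapsed Gibbs sampling, and the word-topic counts learned at epoch t are folded into the Dirichlet prior over topic-word distributions for epoch t+1, so earlier data never has to be revisited. The function names and hyperparameters below (gibbs_lda, online_lda, K, alpha, base_beta, omega) are illustrative assumptions, not values from the paper.

import numpy as np

def gibbs_lda(docs, V, K, alpha, beta, n_iter=200, rng=None):
    # Collapsed Gibbs sampling for LDA on a single epoch's documents.
    # docs: list of lists of word ids in [0, V); beta: (K, V) topic-word prior
    # (an asymmetric beta is how history from earlier epochs enters the model).
    # Returns the (K, V) word-topic count matrix learned from this epoch.
    rng = rng or np.random.default_rng(0)
    n_kw = np.zeros((K, V))            # topic-word counts
    n_dk = np.zeros((len(docs), K))    # document-topic counts
    n_k = np.zeros(K)                  # total words assigned to each topic
    z = []                             # topic assignment for every token
    for d, doc in enumerate(docs):     # random initialization
        zd = rng.integers(K, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            n_kw[k, w] += 1; n_dk[d, k] += 1; n_k[k] += 1
    beta_sum = beta.sum(axis=1)
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]            # remove the token's current assignment
                n_kw[k, w] -= 1; n_dk[d, k] -= 1; n_k[k] -= 1
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta[:, w]) / (n_k + beta_sum)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k            # resample and restore the counts
                n_kw[k, w] += 1; n_dk[d, k] += 1; n_k[k] += 1
    return n_kw

def online_lda(stream, V, K=10, alpha=0.5, base_beta=0.01, omega=1.0):
    # Process a stream of epochs; each item of `stream` is the list of
    # documents that arrived in one epoch. The counts learned at epoch t
    # are carried forward as the topic-word prior for epoch t+1, so topics
    # stay aligned across epochs and old data never has to be re-read.
    beta = np.full((K, V), base_beta)  # symmetric prior for the first epoch
    for docs in stream:
        counts = gibbs_lda(docs, V, K, alpha, beta)
        beta = base_beta + omega * counts   # evolve the prior with new evidence
        phi = counts + base_beta
        yield phi / phi.sum(axis=1, keepdims=True)  # per-epoch topic-word estimates

In this sketch, omega plays the role of a history weight: a larger omega keeps each topic close to its earlier definition, while omega = 0 reduces the procedure to fitting an independent LDA model on every epoch.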

