
Applying topic model in context-aware TV programs recommendation


Abstract

In IPTV systems, users’ watching behavior is influenced by contextual factors such as time of day, day of week, and Live/VOD condition, yet how to incorporate such factors into the recommender depends on the choice of the basic recommending model. In this paper, we apply a topic model from Information Retrieval (IR), Latent Dirichlet Allocation (LDA), as the basic model in a TV program recommender. What makes employing such an approach meaningful is the resemblance between user watching frequency, as the entry in the user-program matrix, and term frequency in the term-document matrix. In addition, we propose an extension to this user-oriented LDA by adding a probabilistic selection node to the probabilistic graphical model, allowing it to learn contextual influence and each user’s individual inclination toward different contextual factors.

The experiment using the proposed approach is conducted on data from “Vision”, a web-based TV content delivery system serving campus users at Lancaster University. The experimental results show that both user-oriented LDA and context-aware LDA converge smoothly on perplexity, with respect to both iteration epochs and topic numbers, under the Gibbs Sampling inference framework. In addition, context-aware LDA outperforms both user-oriented LDA and the baseline approach on precision and diversity metrics when the number of topics exceeds 50. Aside from that, the programs with the highest probability within the top 10 topics illustrate the natural clustering effect of applying this topic model in a TV recommender.
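The user-program analogy described in the abstract can be sketched with a minimal collapsed Gibbs sampler for LDA, treating each user's watch history as a "document" and program IDs as "words", together with the perplexity metric used to monitor convergence. This is an illustrative sketch under the stated analogy, not the authors' implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def lda_gibbs(docs, n_topics, n_programs, alpha=0.1, beta=0.01,
              n_iter=200, seed=0):
    """Collapsed Gibbs sampling for LDA. Each entry of `docs` is one
    user's watch history: a list of program ids, with ids repeated
    according to watch frequency (the user-program matrix entries)."""
    rng = np.random.default_rng(seed)
    n_users = len(docs)
    ndk = np.zeros((n_users, n_topics))     # user-topic counts
    nkw = np.zeros((n_topics, n_programs))  # topic-program counts
    nk = np.zeros(n_topics)                 # total assignments per topic
    z = []                                  # topic of each watch event
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, t in zip(doc, zd):
            ndk[d, t] += 1; nkw[t, w] += 1; nk[t] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                # remove current assignment, then resample from the
                # collapsed conditional p(t | rest)
                ndk[d, t] -= 1; nkw[t, w] -= 1; nk[t] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) \
                    / (nk + n_programs * beta)
                t = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = t
                ndk[d, t] += 1; nkw[t, w] += 1; nk[t] += 1
    # posterior-mean estimates of user-topic and topic-program mixtures
    theta = (ndk + alpha) / (ndk.sum(1, keepdims=True) + n_topics * alpha)
    phi = (nkw + beta) / (nk[:, None] + n_programs * beta)
    return theta, phi

def perplexity(docs, theta, phi):
    """exp of the negative average log-likelihood per watch event,
    the convergence metric referred to in the abstract."""
    ll, n = 0.0, 0
    for d, doc in enumerate(docs):
        for w in doc:
            ll += np.log(theta[d] @ phi[:, w])
            n += 1
    return np.exp(-ll / n)
```

A context-aware variant along the lines of the abstract would add a per-event selection variable choosing which contextual factor (time of day, day of week, Live/VOD) governs the topic draw; that extension is omitted here for brevity.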

