Generalized method of moments algorithm for learning mixtures of Plackett-Luce models.



Abstract

It is often a primary goal of voting systems to produce an aggregate ranking over a set of candidates or alternatives from the preferences of individual agents or voters. The Plackett-Luce model is one of the most widely studied statistical models for describing discrete-choice ordinal preferences and summarizing rank data; it is central to the machine learning subarea of rank aggregation and has also been researched extensively from the perspective of computational social choice. The machine learning community has developed many algorithms for efficiently estimating Plackett-Luce model parameters, motivated by wide-ranging real-world applications of rank data in e-commerce and political science, such as meta-search engines, consumer product rankings, and presidential elections. As in other machine learning tasks, a mixture of models can sometimes fit the data more closely than a single model alone, and mixtures of Plackett-Luce models naturally confer the same benefits for rank data.

A major obstacle to learning the parameters of mixture models is identifiability, which is necessary for drawing correct, meaningful inferences from the learned parameters. Without identifiability, it can become impossible even to estimate the parameters in certain cases. Building on breakthrough results on the identifiability of mixtures of Plackett-Luce models, we propose an efficient generalized method of moments (GMM) algorithm for learning such mixtures and compare it to an existing expectation maximization (EM) algorithm. We outline the overall GMM approach and the selection of the moment conditions our algorithm uses to estimate the ground-truth parameters. Next, we describe the design and implementation details of our GMM algorithm and present both theory and experiments showing that it is significantly faster than the EM algorithm while achieving competitive statistical efficiency.
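The sequential-choice structure of the Plackett-Luce model can be stated concretely: each alternative i has a strength parameter γ_i > 0, and a ranking is built top-down by repeatedly drawing the next item from the remaining alternatives with probability proportional to its strength. A minimal sketch, with function and variable names of our own choosing rather than notation from the thesis:

```python
def pl_rank_prob(ranking, gamma):
    """Probability of a full ranking under a Plackett-Luce model.

    ranking: list of alternative indices, best first.
    gamma:   positive strength parameter for each alternative.
    """
    prob = 1.0
    remaining = sum(gamma[i] for i in ranking)  # total mass of unplaced alternatives
    for i in ranking:
        prob *= gamma[i] / remaining  # choose i among those still remaining
        remaining -= gamma[i]         # remove i from the choice set
    return prob
```

Because each stage is a properly normalized choice, these probabilities sum to 1 over all permutations of the alternatives.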
Finally, we discuss the implications of the identifiability results and of our algorithm for future work extending both to learning mixtures of Plackett-Luce models from big rank data.
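To make the method-of-moments idea concrete for the single-model case (the mixture algorithm of the thesis uses richer moment conditions; this sketch and all its names are our own illustration), one common family of moment conditions comes from pairwise "breakings": under Plackett-Luce, P(a ≻ b) = γ_a/(γ_a + γ_b), which rearranges to P(a ≻ b)·γ_b − P(b ≻ a)·γ_a = 0 for every pair. Substituting empirical pairwise frequencies gives a homogeneous linear system whose normalized solution estimates γ:

```python
import numpy as np

def pl_pairwise_probs(rankings, m):
    """Empirical P[a, b] = fraction of rankings placing a before b."""
    wins = np.zeros((m, m))
    for r in rankings:
        pos = {int(alt): i for i, alt in enumerate(r)}
        for a in range(m):
            for b in range(m):
                if a != b and pos[a] < pos[b]:
                    wins[a, b] += 1
    return wins / len(rankings)

def gmm_plackett_luce(rankings, m):
    """Single-model GMM estimate from pairwise moment conditions.

    Moment conditions: P(a≻b)·γ_b − P(b≻a)·γ_a = 0 for all pairs (a, b).
    """
    P = pl_pairwise_probs(rankings, m)
    A = np.zeros((m, m))
    for a in range(m):
        for b in range(m):
            if a != b:
                A[a, b] = P[a, b]   # coefficient of γ_b in row a's conditions
                A[a, a] -= P[b, a]  # accumulated coefficient of γ_a
    # γ spans the (approximate) null space of A; append normalization Σγ = 1.
    A_aug = np.vstack([A, np.ones(m)])
    rhs = np.zeros(m + 1)
    rhs[-1] = 1.0
    gamma, *_ = np.linalg.lstsq(A_aug, rhs, rcond=None)
    return gamma / gamma.sum()  # renormalize exactly
```

With enough sample rankings the empirical frequencies concentrate around their model values, and the least-squares solution recovers the ground-truth strengths; solving this linear system is what makes moment-based estimation fast relative to iterative likelihood maximization.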

Bibliographic record

  • Author: Piech, Peter D.
  • Affiliation: Rensselaer Polytechnic Institute
  • Degree-granting institution: Rensselaer Polytechnic Institute
  • Subject: Computer science
  • Degree: M.S.
  • Year: 2016
  • Pagination: 30 p.
  • Total pages: 30
  • Format: PDF
  • Language: eng
