
Accelerated Training of Max-Margin Markov Networks with Kernels


Abstract

Structured output prediction is an important machine learning problem both in theory and practice, and the max-margin Markov network (M³N) is an effective approach. All state-of-the-art algorithms for optimizing M³N objectives require at least O(1/ε) iterations to find an ε-accurate solution. [1] broke this barrier by proposing an excessive gap reduction technique (EGR) which converges in O(1/√ε) iterations. However, it is restricted to Euclidean projections, which consequently require an intractable amount of computation per iteration when applied to M³N. In this paper, we show that by extending EGR to Bregman projections, this faster rate of convergence can be retained, and, more importantly, the updates can be performed efficiently by exploiting graphical model factorization. Further, we design a kernelized procedure which allows all computations per iteration to be performed at the same cost as the state-of-the-art approaches.
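The rate improvement hinges on replacing the Euclidean projection inside the excessive gap technique with a Bregman projection. As a rough illustration of why this helps (a generic sketch, not the paper's exact update), with the entropy regularizer F(α) = Σ_i α_i log α_i the Bregman projection of a gradient step onto the probability simplex Δ admits a closed multiplicative form:

% Generic sketch (assumed notation, not taken from the paper):
% D_F(x, y) = F(x) - F(y) - \langle \nabla F(y), x - y \rangle is the Bregman divergence.
\[
  \alpha^{+} \;=\; \operatorname*{argmin}_{\alpha \in \Delta}
      \Big\{ \langle g, \alpha \rangle + \tfrac{1}{\eta}\, D_F(\alpha, \alpha^{t}) \Big\}
  \quad\Longrightarrow\quad
  \alpha^{+}_i \;=\; \frac{\alpha^{t}_i \, e^{-\eta g_i}}{\sum_j \alpha^{t}_j \, e^{-\eta g_j}} .
\]

When α indexes exponentially many joint labelings but the gradient g decomposes over the cliques of the Markov network, the normalizer above is a partition function that can be evaluated by sum-product dynamic programming; this is, roughly, the graphical model factorization the abstract exploits, in contrast to the Euclidean projection, which has no such decomposable closed form.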

Bibliographic details

  • Source
    Algorithmic Learning Theory | 2011 | pp. 292-307 | 16 pages
  • Venue: Espoo, Finland
  • Author affiliations

    Department of Computing Science, University of Alberta, Edmonton, Canada;

    Department of Computer Science, University of Chicago, Chicago, IL, USA;

    Department of Statistics and Computer Science, Purdue University, IN, USA;

  • Conference organizer
  • Original format: PDF
  • Language: eng
  • CLC classification: Artificial intelligence theory
  • Keywords
