Annual Conference on Neural Information Processing Systems

Dual Decomposed Learning with Factorwise Oracles for Structural SVMs of Large Output Domain



Abstract

Many applications of machine learning involve structured outputs with large domains, where learning a structured predictor is prohibitive due to repeated calls to an expensive inference oracle. In this work, we show that by decomposing training of a Structural Support Vector Machine (SVM) into a series of multiclass SVM problems connected through messages, one can replace an expensive structured oracle with Factorwise Maximization Oracles (FMOs) that admit efficient implementations with complexity sublinear in the size of the factor domain. A Greedy Direction Method of Multiplier (GDMM) algorithm is then proposed to exploit the sparsity of messages while guaranteeing convergence to ε-sub-optimality after O(log(1/ε)) passes of FMOs over every factor. We conduct experiments on chain-structured and fully-connected problems with large output domains, where the proposed approach is orders of magnitude faster than current state-of-the-art algorithms for training Structural SVMs.
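To make the central idea concrete, the following is a minimal sketch (not the authors' implementation) of what a Factorwise Maximization Oracle provides: instead of running full structured inference (e.g. Viterbi over an entire chain), training only needs, per factor, the highest-scoring label under the factor's local scores plus incoming dual messages. All names and the toy data below are illustrative assumptions.

```python
# Hypothetical sketch of a Factorwise Maximization Oracle (FMO) for a
# single unary factor; function name and inputs are illustrative, not
# taken from the paper's code.
import numpy as np

def fmo_unary(theta_f, messages):
    """Return (best_label, best_score) for one factor.

    theta_f  : local scores over the factor's label domain
    messages : dual messages added to the local scores
    """
    scores = theta_f + messages          # augmented factor scores
    best = int(np.argmax(scores))        # maximize over this factor only
    return best, float(scores[best])

# Toy usage: one factor with a 5-label domain.
theta = np.array([0.2, -0.1, 1.3, 0.0, 0.4])
msgs = np.array([0.0, 0.5, -2.0, 0.0, 1.1])
label, score = fmo_unary(theta, msgs)
# label == 4, score == 1.5
```

The point of the decomposition is that each such local maximization is cheap, and (per the abstract) can be implemented with cost sublinear in the factor-domain size when the messages are sparse.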
