
Dual Decomposed Learning with Factorwise Oracles for Structural SVMs of Large Output Domain


Abstract

Many applications of machine learning involve structured outputs with large domains, where learning a structured predictor is prohibitive due to repeated calls to an expensive inference oracle. In this work, we show that by decomposing training of a Structural Support Vector Machine (SVM) into a series of multiclass SVM problems connected through messages, one can replace an expensive structured oracle with Factorwise Maximization Oracles (FMOs) that admit efficient implementations with complexity sublinear in the factor domain. A Greedy Direction Method of Multiplier (GDMM) algorithm is then proposed to exploit the sparsity of messages while guaranteeing convergence to ε sub-optimality after O(log(1/ε)) passes of FMOs over every factor. We conduct experiments on chain-structured and fully-connected problems with large output domains, where the proposed approach is orders of magnitude faster than current state-of-the-art algorithms for training Structural SVMs.
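At its core, an FMO maximizes a single factor's (loss-augmented) score plus the messages it receives, over that factor's output domain only, instead of running inference over the full structured output. The following is a minimal, illustrative Python sketch of such an oracle for one factor; the names `factorwise_max_oracle`, `factor_scores`, and `incoming_messages` are assumptions made for illustration, and this brute-force version does not reflect the paper's sublinear implementation.

```python
import numpy as np

def factorwise_max_oracle(factor_scores, incoming_messages, true_label=None):
    """Illustrative Factorwise Maximization Oracle (FMO) for a single factor.

    factor_scores     : array of scores, one per assignment in the factor's domain
    incoming_messages : array of messages from neighboring subproblems, same shape
    true_label        : if given, perform loss-augmented maximization (0/1 loss
                        on this factor, as in margin-rescaled Structural SVMs)

    Returns the maximizing assignment and its augmented score.
    This is a hypothetical sketch, not the paper's implementation.
    """
    augmented = factor_scores + incoming_messages
    if true_label is not None:
        # Add 1 to every assignment except the ground truth (margin rescaling).
        augmented = augmented + 1.0
        augmented[true_label] -= 1.0
    best = int(np.argmax(augmented))
    return best, float(augmented[best])


# Usage: a factor whose domain has 5 labels, with messages from its neighbors.
scores = np.array([0.2, 1.5, -0.3, 0.9, 0.0])
messages = np.array([0.1, -0.2, 0.0, 0.4, 0.3])
print(factorwise_max_oracle(scores, messages, true_label=1))
```

In the decomposed training scheme described above, an oracle of this kind would be called per factor within each GDMM pass; the speedup reported in the abstract comes from implementing this maximization in time sublinear in the factor domain, which the brute-force `np.argmax` here does not attempt.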