European Journal of Operational Research

Decomposition algorithm for distributionally robust optimization using Wasserstein metric with an application to a class of regression models



Abstract

We study distributionally robust optimization (DRO) problems where the ambiguity set is defined using the Wasserstein metric and can account for a bounded support. We show that this class of DRO problems can be reformulated as decomposable semi-infinite programs. We use a cutting-surface method to solve the reformulated problem for the general nonlinear model, assuming that we have a separation oracle. As examples, we consider problems arising from machine learning models in which the variables couple with the data in a bilinear form in the loss function. We present a branch-and-bound algorithm that solves the separation problem for this case using an iterative piecewise-linear approximation scheme. We use a distributionally robust generalization of the logistic regression model to test our algorithm. We also show that, when generating a cutting surface, the logistic-loss function can be approximated to a given accuracy with significantly fewer linear pieces than are needed for a general loss function. Numerical experiments on the distributionally robust logistic regression models show that the number of oracle calls is typically 20-50 to achieve 5-digit precision. The solution found by the model has better predictive power than classical logistic regression when the sample size is small. (C) 2019 Published by Elsevier B.V.
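To illustrate the piecewise-linear approximation idea mentioned in the abstract, the sketch below under-approximates the convex logistic loss log(1 + exp(-t)) by the maximum of tangent lines (a standard cutting-plane construction, not the authors' exact scheme); the tangent points, the interval [-10, 10], and the evaluation grid are illustrative assumptions:

```python
import math

def logistic_loss(t):
    # Numerically stable log(1 + exp(-t)).
    return math.log1p(math.exp(-abs(t))) + max(-t, 0.0)

def tangent_minorant(points):
    """Tangent lines (slope, intercept) of the logistic loss at the given
    points. Since the loss is convex, each tangent minorizes it, so their
    pointwise maximum is a piecewise-linear under-approximation."""
    pieces = []
    for t0 in points:
        slope = -1.0 / (1.0 + math.exp(t0))      # d/dt log(1 + exp(-t)) at t0
        intercept = logistic_loss(t0) - slope * t0
        pieces.append((slope, intercept))
    return pieces

def pwl_value(pieces, t):
    # Value of the piecewise-linear approximation at t.
    return max(a * t + b for a, b in pieces)

def max_gap(pieces, lo=-10.0, hi=10.0, n=2001):
    # Largest approximation error over a uniform grid on [lo, hi].
    grid = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return max(logistic_loss(t) - pwl_value(pieces, t) for t in grid)
```

With 5 tangent points on [-10, 10] the worst-case gap is a few hundredths, and it shrinks rapidly as pieces are added, consistent with the abstract's observation that the logistic loss needs relatively few linear pieces for a given accuracy.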
