Annual Conference and Expo of the Institute of Industrial Engineers

A Classification of Phi-Divergences for Data-Driven Stochastic Optimization



Abstract

Phi-divergences (e.g., the Kullback-Leibler divergence, the chi-squared distance) measure the distance between two probability distributions. In data-driven stochastic optimization, they can be used to build an ambiguity set of distributions centered around a nominal distribution, and optimization is then performed over this ambiguity set. The nominal distribution is typically determined from collected observations, expert opinions, simulations, and so on. Because many phi-divergences exist, a decision maker faces the question of how each divergence behaves for his or her problem and which one to choose. In this paper, we present a classification of phi-divergences to elucidate their use for models with different properties and different sources of data. We provide examples of phi-divergences that result in commonly used risk models and discuss their behavior with respect to our classification. We refine our classification for a class of problems and analyze each category in more detail. We illustrate the behavior of several commonly used phi-divergences from each classification category on a small power capacity expansion problem.
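To make the notion concrete, the sketch below evaluates a few standard phi-divergences between a candidate distribution p and a nominal distribution q on a finite support, using the generic definition I_phi(p, q) = sum_i q_i * phi(p_i / q_i). The function and variable names, the example distributions, and the particular generators shown are illustrative assumptions, not taken from the paper; the generator formulas are the standard textbook choices.

```python
import numpy as np

def phi_divergence(p, q, phi):
    """Generic phi-divergence I_phi(p, q) = sum_i q_i * phi(p_i / q_i).

    Assumes p and q are probability vectors on the same finite support
    with strictly positive q_i (so the likelihood ratio is well defined).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * phi(p / q)))

# Standard convex generators phi(t), each with phi(1) = 0:
kl = lambda t: t * np.log(t) - t + 1           # Kullback-Leibler divergence
chi2 = lambda t: (t - 1.0) ** 2                # chi-squared distance
hellinger = lambda t: (np.sqrt(t) - 1.0) ** 2  # (squared) Hellinger distance

# Hypothetical example: a uniform nominal distribution q (e.g., from data)
# and one candidate distribution p inside a would-be ambiguity set.
q = np.array([0.25, 0.25, 0.25, 0.25])
p = np.array([0.40, 0.30, 0.20, 0.10])

d_kl = phi_divergence(p, q, kl)
d_chi2 = phi_divergence(p, q, chi2)          # equals sum_i (p_i - q_i)^2 / q_i
d_hel = phi_divergence(p, q, hellinger)
```

An ambiguity set in this setting would collect all p with I_phi(p, q) below some radius; choosing a different generator phi changes which distributions that set admits, which is the behavior the paper's classification addresses.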

