Annual Conference and Expo of the Institute of Industrial Engineers

A Classification of Phi-Divergences for Data-Driven Stochastic Optimization



Abstract

Phi-divergences (e.g., Kullback-Leibler divergence, chi-squared distance, etc.) provide a measure of distance between two probability distributions. In data-driven stochastic optimization, they can be used to build an ambiguity set of distributions centered around a nominal distribution, and this ambiguity set is then used for optimization. The nominal distribution is often determined by collected observations, expert opinions, simulations, etc. Given that there are many phi-divergences, a decision maker is left with the question of how each divergence behaves for his or her problem and which one to choose. In this paper, we present a classification of phi-divergences to elucidate their use for models with different properties and different sources of data. We provide examples of phi-divergences that result in commonly used risk models and discuss their behavior with respect to our classification. We refine our classification for a class of problems and further analyze each category in more detail. We illustrate the behavior of several commonly used phi-divergences in each classification category on a small power capacity expansion problem.
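As a minimal sketch (not part of the paper itself), the phi-divergence between a candidate distribution p and a nominal distribution q over a finite support can be computed as I_phi(p, q) = sum_i q_i * phi(p_i / q_i), where phi is convex with phi(1) = 0. The function names and example distributions below are illustrative assumptions:

```python
import numpy as np

def phi_divergence(p, q, phi):
    """Compute I_phi(p, q) = sum_i q_i * phi(p_i / q_i) for discrete distributions.

    Assumes p and q are nonnegative, sum to 1, and q_i > 0 on the support.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * phi(p / q)))

# Common phi functions, defined for t > 0 with phi(1) = 0.
kl = lambda t: t * np.log(t) - t + 1            # Kullback-Leibler divergence
chi2 = lambda t: (t - 1.0) ** 2                 # chi-squared distance
hellinger = lambda t: (np.sqrt(t) - 1.0) ** 2   # Hellinger distance

# Illustrative data: q is a nominal (e.g., empirical) distribution,
# p is a candidate distribution in a possible ambiguity set.
q = np.array([0.25, 0.25, 0.25, 0.25])
p = np.array([0.40, 0.30, 0.20, 0.10])

print(phi_divergence(p, q, kl))
print(phi_divergence(p, q, chi2))
print(phi_divergence(p, q, hellinger))
```

An ambiguity set of the kind described above would then collect all distributions p with phi_divergence(p, q, phi) below a chosen radius; different choices of phi yield differently shaped sets and, consequently, different risk models.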

