Monthly Weather Review

Kullback-Leibler Divergence as a Forecast Skill Score with Classic Reliability-Resolution-Uncertainty Decomposition

Abstract

This paper presents a score that can be used for evaluating probabilistic forecasts of multicategory events. The score is a reinterpretation of the logarithmic score or ignorance score, now formulated as the relative entropy or Kullback-Leibler divergence of the forecast distribution from the observation distribution. Using the information-theoretical concepts of entropy and relative entropy, a decomposition into three components is presented, analogous to the classic decomposition of the Brier score. The information-theoretical counterparts of the components uncertainty, resolution, and reliability provide diagnostic information about the quality of forecasts. The overall score measures the information conveyed by the forecast. As was shown recently, information theory provides a sound framework for forecast verification. The new decomposition, which has proven very useful for the Brier score and is widely used there, can help the logarithmic score gain acceptance in meteorology.
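
The decomposition described in the abstract can be illustrated with a short sketch. The code below is an illustrative reconstruction, not the paper's own implementation: it assumes one-hot observation vectors, groups forecast cases by identical forecast distributions, and arranges the components as reliability minus resolution plus uncertainty, mirroring the classic Brier-score decomposition; the function names (`divergence_score_decomposition`, `kl_divergence`) are invented for this example.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D_KL(p || q) in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / np.maximum(q[mask], eps))))

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    mask = p > 0
    return float(-np.sum(p[mask] * np.log2(p[mask])))

def divergence_score_decomposition(forecasts, observations):
    """
    forecasts:    (N, K) array of forecast probability vectors
    observations: (N, K) one-hot array of observed categories

    Returns (reliability, resolution, uncertainty); the divergence score
    is reliability - resolution + uncertainty.
    """
    forecasts = np.asarray(forecasts, dtype=float)
    observations = np.asarray(observations, dtype=float)
    n_total = forecasts.shape[0]

    o_bar = observations.mean(axis=0)        # climatological distribution
    uncertainty = entropy(o_bar)

    reliability = 0.0
    resolution = 0.0
    # Group cases that received exactly the same forecast distribution
    # (a simplification; in practice forecasts are often binned).
    unique_fcsts, inverse = np.unique(forecasts, axis=0, return_inverse=True)
    for k, f_k in enumerate(unique_fcsts):
        idx = inverse == k
        n_k = idx.sum()
        o_bar_k = observations[idx].mean(axis=0)  # conditional observed frequencies
        reliability += n_k / n_total * kl_divergence(o_bar_k, f_k)
        resolution += n_k / n_total * kl_divergence(o_bar_k, o_bar)

    return reliability, resolution, uncertainty

if __name__ == "__main__":
    # Three-category example: two distinct forecast distributions, issued 5 times each.
    f = np.array([[0.6, 0.3, 0.1]] * 5 + [[0.2, 0.5, 0.3]] * 5)
    o = np.eye(3)[[0, 0, 0, 1, 2, 1, 1, 2, 1, 0]]
    rel, res, unc = divergence_score_decomposition(f, o)
    # The recombined score equals the mean ignorance (logarithmic) score.
    ds_direct = -np.mean(np.log2(np.sum(f * o, axis=1)))
    print(rel - res + unc, ds_direct)
```

The recombination check at the end reflects the exactness of the decomposition: averaging the conditional observed frequencies within each forecast group reproduces the mean logarithmic score, just as in Murphy's decomposition of the Brier score.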
