Conference: International Conference on Soft Computing and Intelligent Systems; International Symposium on Advanced Intelligent Systems

Unsupervised weight parameter estimation for exponential mixture distribution based on symmetric Kullback-Leibler divergence



Abstract

When multiple component predictors are available, it is promising to integrate them into a single predictor for more advanced reasoning. If each component predictor is given as a stochastic model in the form of a probability distribution, an exponential mixture of the component probability distributions provides a good way to integrate them. However, the weight parameters of the exponential mixture model are difficult to estimate when no data are available for performance evaluation. As a suboptimal way around this problem, the weight parameters can be estimated so that the exponential mixture model becomes a balance point, defined as an equilibrium point with respect to the distances from and to all component probability distributions. In this paper, we propose a weight parameter estimation method that realizes this concept using the symmetric Kullback-Leibler divergence, and we discuss its properties.
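The balance-point idea can be written out explicitly. The following is a plausible formalization under common conventions, not a quotation from the paper: the exponential mixture is taken to be the log-linear pool of the components, and the symmetric Kullback-Leibler divergence is Jeffreys' divergence.

```latex
% Exponential (log-linear) mixture of K component distributions p_1,...,p_K,
% with weights w_i >= 0, \sum_i w_i = 1, and normalizing constant Z(w):
p_w(x) = \frac{1}{Z(w)} \prod_{i=1}^{K} p_i(x)^{w_i},
\qquad
Z(w) = \int \prod_{i=1}^{K} p_i(x)^{w_i}\, dx .

% Symmetric Kullback-Leibler (Jeffreys) divergence:
J(p, q) = D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p).

% Balance-point condition: choose w so that the mixture is equidistant,
% under J, from every component distribution:
J(p_w, p_1) = J(p_w, p_2) = \cdots = J(p_w, p_K).
```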
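To make this concrete numerically, here is a minimal sketch, not the estimation procedure from the paper: with one-dimensional Gaussian components the exponential mixture is again Gaussian in closed form, the symmetric KL divergence also has a closed form, and a naive damped fixed-point update (introduced here purely for illustration) adjusts the weights until the mixture is roughly equidistant from the components.

```python
import numpy as np

def exp_mixture_gauss(w, mu, var):
    """Exponential (log-linear) mixture of 1-D Gaussians is again Gaussian:
    its precision is the weighted sum of the component precisions."""
    prec = np.sum(w / var)
    mean = np.sum(w * mu / var) / prec
    return mean, 1.0 / prec

def sym_kl_gauss(m1, v1, m2, v2):
    """Symmetric (Jeffreys) KL divergence between two 1-D Gaussians."""
    d2 = (m1 - m2) ** 2
    return (v1 + d2) / (2.0 * v2) + (v2 + d2) / (2.0 * v1) - 1.0

# Two hypothetical component predictors given as Gaussians N(mu_i, var_i).
mu  = np.array([0.0, 3.0])
var = np.array([1.0, 4.0])

w = np.full(len(mu), 1.0 / len(mu))      # start from uniform weights
for _ in range(1000):
    m_w, v_w = exp_mixture_gauss(w, mu, var)
    J = np.array([sym_kl_gauss(m_w, v_w, m, v) for m, v in zip(mu, var)])
    target = w * J / np.sum(w * J)       # give more weight to farther components
    w = 0.9 * w + 0.1 * target           # damped fixed-point step
    w /= w.sum()

m_w, v_w = exp_mixture_gauss(w, mu, var)
J = np.array([sym_kl_gauss(m_w, v_w, m, v) for m, v in zip(mu, var)])
print("weights:", np.round(w, 3))
print("symmetric KL to each component (approximately equal):", np.round(J, 3))
```

The Gaussian case admits closed forms because, for exponential-family components with weights summing to one, the exponential mixture stays in the same family with natural parameter equal to the weighted average of the component natural parameters; for general components the normalizer Z(w) usually needs numerical treatment.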

