2011 IEEE Statistical Signal Processing Workshop

Entropic priors for hidden-Markov model classification

Abstract

In pattern classification problems, lack of knowledge about the prior distribution is typically filled in with uniform priors. However, this choice may lead to unsatisfactory inference results when the amount of observed data is scarce. The application of the Maximum Entropy (ME) principle to prior determination results in the so-called entropic priors, which provide a much more cautious inference in comparison to uniform priors. The idea, introduced mainly within the context of theoretical physics, is applied here to signal processing scenarios. We derive efficient formulas for computing and updating entropic priors when the likelihoods follow Independent, Markov and Hidden Markov models, and we apply them to a target-track classification task.
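
The abstract states the principle but not the formulas. As a rough, hedged sketch of the idea (not the paper's derivation), the snippet below implements the commonly cited entropic-prior form P(c) ∝ exp(H(X|c)) for the simplest case the paper covers, independent discrete observations; the function names (entropy, entropic_prior, posterior) and the two class models are illustrative placeholders, not taken from the paper.

    import numpy as np

    def entropy(p):
        # Shannon entropy (in nats) of a discrete distribution.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def entropic_prior(class_models, n_obs=1):
        # Entropic prior: P(c) proportional to exp(H(X_1..X_n | c)).
        # For n_obs i.i.d. observations the sequence entropy is n_obs * H(X|c).
        h = np.array([n_obs * entropy(p) for p in class_models])
        w = np.exp(h - h.max())          # subtract max for numerical stability
        return w / w.sum()

    def posterior(class_models, prior, observations):
        # Bayes posterior over classes for a sequence of observed symbols.
        log_post = np.log(prior)
        for x in observations:
            log_post = log_post + np.log([p[x] for p in class_models])
        log_post -= log_post.max()
        post = np.exp(log_post)
        return post / post.sum()

    # Two hypothetical class models over a 3-symbol alphabet (illustrative only).
    models = [np.array([0.8, 0.1, 0.1]),   # peaked, low-entropy model
              np.array([0.4, 0.3, 0.3])]   # spread, high-entropy model
    prior = entropic_prior(models, n_obs=3)
    print("entropic prior:", prior)        # larger mass on the high-entropy class
    print("posterior:", posterior(models, prior, [0, 0, 2]))

For Markov and hidden-Markov likelihoods the abstract indicates that the authors derive corresponding efficient, recursively updatable entropy formulas; the i.i.d. case above only illustrates how a higher-entropy (less committal) class model receives larger prior mass than it would under a uniform prior.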
