Data Mining and Knowledge Discovery

Parameter learning in hybrid Bayesian networks using prior knowledge

Abstract

Mixtures of truncated basis functions (MoTBFs) have recently been proposed as a generalisation of mixtures of truncated exponentials and mixtures of polynomials for modelling univariate and conditional distributions in hybrid Bayesian networks. In this paper we analyse the problem of learning the parameters of marginal and conditional MoTBF densities when both prior knowledge and data are available. Incorporating prior knowledge provides a valuable tool for obtaining useful models, especially in application domains where data are costly or scarce and prior knowledge is available from practitioners. We explore scenarios where the prior knowledge can be expressed as an MoTBF density that is afterwards combined with another MoTBF density estimated from the available data. The resulting model remains within the MoTBF class, which is a convenient property from the point of view of inference in hybrid Bayesian networks. The performance of the proposed method is tested in a series of experiments carried out over synthetic and real data.
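To make the closure property mentioned in the abstract concrete, the following is a minimal sketch, not the paper's actual estimator: it assumes, purely for illustration, that both the prior density and the data-estimated density are mixtures of polynomials (one member of the MoTBF family) on the same interval sharing the basis 1, x, x^2, ..., and that they are pooled by a simple convex combination. The function name combine_poly_motbfs and the parameter prior_weight are hypothetical and not taken from the paper.

```python
# A minimal sketch of the closure property behind the combination idea, NOT
# the paper's estimator: both densities are assumed to be mixtures of
# polynomials on [0, 1] with the shared basis 1, x, x^2, ..., pooled by a
# convex combination with a hypothetical weight `prior_weight`.
import numpy as np

def combine_poly_motbfs(prior_coeffs, data_coeffs, prior_weight):
    """Convex combination of two polynomial densities given by coefficients.

    Because both densities use the same basis, the combination is again a
    polynomial density: its coefficients are the weighted average of the
    input coefficients, so the result stays inside the MoTBF class.
    """
    prior = np.asarray(prior_coeffs, dtype=float)
    data = np.asarray(data_coeffs, dtype=float)
    size = max(prior.size, data.size)
    prior = np.pad(prior, (0, size - prior.size))  # zero-pad to a common length
    data = np.pad(data, (0, size - data.size))
    return prior_weight * prior + (1.0 - prior_weight) * data

# Example on [0, 1]: a uniform "expert" prior p0(x) = 1 and a density
# pD(x) = 2x estimated from data.
combined = combine_poly_motbfs([1.0], [0.0, 2.0], prior_weight=0.3)
print(combined)  # [0.3 1.4]  ->  p(x) = 0.3 + 1.4x

# The combination is still a proper density: it integrates to ~1 on [0, 1].
print(sum(c / (k + 1) for k, c in enumerate(combined)))
```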