
Bayesian methods for non-gaussian data modeling and applications


Abstract

Finite mixture models are among the most useful machine learning techniques and are receiving considerable attention in various applications. The use of finite mixture models in image and signal processing has proved to be of considerable interest, both for theoretical development and for their usefulness in several applications. In most applications, the Gaussian density is used to model the mixture components. Although a Gaussian mixture may provide a reasonable approximation to many real-world distributions, it is certainly not always the best approximation, especially in image and signal processing applications, where we often deal with non-Gaussian data. In this thesis, we propose two novel approaches for modeling non-Gaussian data. These approaches model the data with two highly flexible distributions: the generalized Gaussian distribution (GGD) and the general Beta distribution. We are motivated by the fact that these distributions can fit many distributional shapes and can therefore be considered a useful class of flexible models for problems and applications involving measurements and features that deviate markedly from the Gaussian shape. For the mixture estimation and selection problem, researchers have demonstrated that Bayesian approaches are fully optimal. Bayesian learning allows prior knowledge to be incorporated in a formally coherent way that avoids overfitting. For this reason, we adopt different Bayesian approaches to learn our models' parameters. First, we present a fully Bayesian approach to analyzing finite generalized Gaussian mixture models, which encompass several standard mixtures such as the Laplace and the Gaussian. This approach evaluates the posterior distribution and Bayes estimators using a Gibbs sampling algorithm, and selects the number of components in the mixture using the integrated likelihood.
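The flexibility claimed for the GGD can be seen directly from its density: a shape parameter controls the tail behaviour, recovering the Laplace and Gaussian densities mentioned above as special cases. The sketch below uses the standard textbook parameterisation f(x; μ, α, β) = β / (2α Γ(1/β)) · exp(−(|x − μ|/α)^β), which may differ from the exact notation of the thesis:

```python
import math

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density f(x; mu, alpha, beta).

    mu is the location, alpha > 0 the scale, beta > 0 the shape:
      beta = 1 recovers the Laplace distribution (scale alpha),
      beta = 2 recovers the Gaussian (with alpha = sigma * sqrt(2)).
    """
    coeff = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coeff * math.exp(-((abs(x - mu) / alpha) ** beta))
```

Sweeping beta between these values (and beyond beta = 2, toward the uniform limit) is what lets a single mixture component track peaky, heavy-tailed, or flat-topped marginals that a Gaussian component cannot.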
We also propose a fully Bayesian approach to learning finite Beta mixtures using a Reversible Jump Markov Chain Monte Carlo (RJMCMC) technique, which simultaneously allows cluster assignment, parameter estimation, and selection of the optimal number of clusters. We then validate the proposed methods by applying them to different image processing applications.
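The allocation–estimation cycle at the core of this kind of Bayesian mixture learning can be illustrated with a deliberately simplified sampler: a fixed-dimension Gibbs sampler for a two-component, unit-variance Gaussian mixture. This is only a sketch of the general mechanism, not the thesis's method (which uses generalized Gaussian and Beta components, and RJMCMC moves that also change the number of components); the priors below are illustrative choices:

```python
import numpy as np

def gibbs_mixture(x, n_iter=500, seed=0):
    """Toy Gibbs sampler for a 2-component Gaussian mixture, unit variances.

    Illustrative priors (not from the thesis):
      component means ~ N(0, 10^2), mixing weights ~ Dirichlet(1, 1).
    Returns the chains of sampled means and weights.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = np.array([x.min(), x.max()])        # crude initialisation
    w = np.array([0.5, 0.5])
    mu_chain, w_chain = [], []
    for _ in range(n_iter):
        # 1) sample allocations z_i | mu, w  (responsibilities, then draw)
        logp = np.log(w) - 0.5 * (x[:, None] - mu[None, :]) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = (rng.random(n) < p[:, 1]).astype(int)
        # 2) sample means mu_k | z  (conjugate normal update, prior N(0, 10^2))
        for k in range(2):
            xk = x[z == k]
            prec = len(xk) + 1.0 / 100.0     # likelihood + prior precision
            mu[k] = rng.normal(xk.sum() / prec, 1.0 / np.sqrt(prec))
        # 3) sample weights | z  (Dirichlet(1,1) prior -> Dirichlet posterior)
        counts = np.bincount(z, minlength=2)
        w = rng.dirichlet(counts + 1)
        mu_chain.append(mu.copy())
        w_chain.append(w.copy())
    return np.array(mu_chain), np.array(w_chain)
```

RJMCMC extends exactly this scheme with split/merge and birth/death proposals whose acceptance ratios include a Jacobian term, so the chain can also move between mixtures of different sizes.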

Bibliographic record

  • Author

    Elguebaly Tarek;

  • Affiliation
  • Year: 2009
  • Pages
  • Format: PDF
  • Language: en
  • CLC classification
