Iberian Conference on Pattern Recognition and Image Analysis

Robust Learning Algorithm for the Mixture of Experts



Abstract

The Mixture of Experts (ME) model is a type of modular artificial neural network (MANN) whose architecture is composed of different kinds of networks that compete to learn different aspects of the problem. This model is used when the search space is stratified. The learning algorithm of the ME model consists of estimating the network parameters to achieve a desired performance. To estimate the parameters, distributional assumptions are made, so the learning algorithm and, consequently, the estimated parameters depend on the assumed distribution. When the data contain outliers, however, this assumption is no longer valid, and the model is strongly affected and highly sensitive to the data, as shown in this work. We propose a robust learning estimator based on the M-estimator, a generalization of the maximum likelihood estimator. Finally, a simulation study is presented in which the robust estimator outperforms the maximum likelihood estimator (MLE).
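The abstract does not give the algorithm in detail; the following is a minimal illustrative sketch of the idea, not the authors' implementation. It fits a two-expert mixture with a softmax gating network by gradient ascent on a mixture objective, once with the usual Gaussian likelihood term (the MLE criterion) and once with the squared residual replaced by a Huber rho-function, one common M-estimator choice. All names (fit_moe, huber_rho, the delta threshold) and the toy data are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed, not the paper's exact algorithm): mixture of two linear
# experts with a softmax gate, trained by gradient steps on a soft responsibility-
# weighted objective. robust=False uses the Gaussian/MLE squared-error term;
# robust=True swaps in a Huber rho-function, which bounds each point's influence.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def huber_rho(r, delta=1.0):
    """Huber rho-function: quadratic near zero, linear in the tails."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def fit_moe(X, y, n_experts=2, robust=False, lr=0.05, epochs=2000, delta=1.0):
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])              # add bias column
    W = rng.normal(scale=0.1, size=(n_experts, d + 1))  # expert regression weights
    V = rng.normal(scale=0.1, size=(n_experts, d + 1))  # gating weights
    for _ in range(epochs):
        g = softmax(Xb @ V.T)                         # gating probabilities, (n, K)
        preds = Xb @ W.T                              # expert predictions, (n, K)
        r = y[:, None] - preds                        # residuals per expert
        loss_k = huber_rho(r, delta) if robust else 0.5 * r**2
        # responsibilities: gate weight times exp(-loss), an EM-style soft weighting
        resp = g * np.exp(-loss_k)
        resp /= resp.sum(axis=1, keepdims=True) + 1e-12
        # psi is the derivative of the loss in r: unbounded for MLE, clipped for Huber
        psi = np.clip(r, -delta, delta) if robust else r
        W += lr * (resp * psi).T @ Xb / n             # weighted update of the experts
        V += lr * (resp - g).T @ Xb / n               # gate follows the responsibilities
    return W, V

# Stratified toy data: two linear regimes plus a few gross outliers in y.
X = rng.uniform(-3, 3, size=(300, 1))
y = np.where(X[:, 0] < 0, 2 * X[:, 0] + 1, -X[:, 0] + 4) + rng.normal(0, 0.3, 300)
y[:10] += 5                                           # contaminate with outliers
for robust in (False, True):
    W, V = fit_moe(X, y, robust=robust)
    print("robust" if robust else "MLE   ", np.round(W, 2))
```

The only change between the two runs is the loss and its derivative: clipping psi at delta caps how much any single contaminated point can pull the expert weights, which is the sense in which the M-estimator makes the learning robust.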
