
Evidence-based mixture of MLP-experts

Abstract

Mixture of Experts (ME) is a modular neural network architecture for supervised learning. In this paper, we propose an evidence-based ME for the classification problem. In the basic form of ME, the problem space is automatically partitioned into several subspaces for the experts, and the experts' outputs are combined by a gating network. Satisfactory performance of the basic ME depends on the diversity among the experts. In conventional ME, this diversity is provided by different initializations of the experts and by supervision of the gating network during the learning procedure. The main idea of our proposed method is to employ the Dempster-Shafer (D-S) theory of evidence both to improve the determination of the learning parameters (which results in more diverse experts) and to improve the way the experts' decisions are combined. Experimental results on several data sets from the UCI repository show that the proposed method yields better classification rates than the basic ME and static combining of neural networks based on D-S theory.
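The abstract gives no implementation details, but the fusion step it builds on can be sketched. Below is a minimal, hypothetical illustration of how two experts' class probabilities are commonly turned into D-S mass functions by reliability discounting and then fused with Dempster's rule of combination. The probability vectors, the `reliability` values, and the `discount`/`dempster_combine` helpers are illustrative assumptions, not the authors' method.

```python
import numpy as np

def discount(probs, reliability):
    """Convert an expert's class probabilities into a D-S mass function:
    singleton masses are the probabilities scaled by the expert's
    reliability; the remaining 1 - reliability goes to the whole frame
    Theta (total ignorance). Reliability here is an assumed scalar."""
    masses = reliability * np.asarray(probs, dtype=float)
    theta = 1.0 - masses.sum()
    return masses, theta

def dempster_combine(m1, t1, m2, t2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are the singleton classes plus the frame Theta.
    {i} & {i}, {i} & Theta, and Theta & {i} all yield {i};
    Theta & Theta yields Theta; {i} & {j} with i != j is empty (conflict)."""
    singletons = m1 * m2 + m1 * t2 + t1 * m2   # unnormalized mass on each class
    theta = t1 * t2                            # unnormalized mass on Theta
    conflict = m1.sum() * m2.sum() - (m1 * m2).sum()  # mass lost to empty sets
    norm = 1.0 - conflict                      # Dempster normalization factor
    return singletons / norm, theta / norm

if __name__ == "__main__":
    # Hypothetical softmax outputs of two MLP experts for one sample (3 classes).
    p1 = np.array([0.7, 0.2, 0.1])
    p2 = np.array([0.5, 0.4, 0.1])

    # Reliabilities are assumptions; in practice they might come from
    # validation accuracy or from the gating network's weights.
    m1, t1 = discount(p1, reliability=0.9)
    m2, t2 = discount(p2, reliability=0.8)

    masses, theta = dempster_combine(m1, t1, m2, t2)
    print("combined class masses:", masses, "mass on Theta:", theta)
    print("predicted class:", int(np.argmax(masses)))
```

If both reliabilities are 1, the rule reduces to a normalized product of the two probability vectors; the discounting instead lets a less reliable expert pull the combined decision toward ignorance rather than toward its own prediction.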
