IEEE Workshop on Neural Networks for Signal Processing

Classification using hierarchical mixtures of experts



Abstract

There has recently been widespread interest, in both the statistics and neural networks communities, in the use of multiple models for classification and regression. The hierarchical mixture of experts (HME) has been successful in a number of regression problems, yielding significantly faster training through the use of the expectation-maximisation algorithm. In this paper we extend the HME to classification, and results are reported for three common classification benchmark tests: exclusive-OR, N-input parity, and two spirals.
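The benchmarks above are standard tests of nonlinear separability. As an illustrative sketch only (not the paper's implementation), the following shows how a one-level, two-expert mixture with a soft gate can solve exclusive-OR: the gate splits the input plane along one linear boundary, and each linear-logistic expert then handles its own half-space. All names are illustrative, and the parameters are hand-set here; in the paper they would instead be fitted with EM, within a hierarchical gating tree.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def moe_predict(x, gate_w, gate_b, expert_w, expert_b):
    """Mixture of two logistic experts with a soft gate:
    p(y=1 | x) = sum_k g_k(x) * p_k(y=1 | x)."""
    g1 = sigmoid(x @ gate_w + gate_b)            # gate's weight on expert 1, shape (N,)
    gates = np.stack([g1, 1.0 - g1], axis=1)     # (N, 2); rows sum to 1
    experts = sigmoid(x @ expert_w + expert_b)   # (N, 2) per-expert class-1 probabilities
    return (gates * experts).sum(axis=1)

# XOR benchmark inputs and labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

# Hand-set parameters (illustrative; the paper fits these by EM):
# the gate splits the plane along x1 = x2, and each expert solves
# its half-space with a single linear-logistic boundary.
gate_w = np.array([10.0, -10.0]); gate_b = 0.0
expert_w = np.array([[10.0, -10.0],   # column k holds expert k's weights
                     [-10.0, 10.0]])
expert_b = np.array([-5.0, -5.0])

preds = moe_predict(X, gate_w, gate_b, expert_w, expert_b)
print((preds > 0.5).astype(int))   # recovers the XOR labels: [0 1 1 0]
```

The point of the decomposition is that no single linear-logistic unit can represent XOR, but a linear gate over two linear experts can, which is why the paper uses it as a benchmark.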


