IEEE Transactions on Neural Networks

Reduced Pattern Training Based on Task Decomposition Using Pattern Distributor

Abstract

Task decomposition with pattern distributor (PD) is a new task decomposition method for multilayered feedforward neural networks (NNs). A pattern distributor network that implements this new task decomposition method is proposed, along with a theoretical model to analyze its performance. A method named reduced pattern training (RPT) is also introduced, aiming to improve the performance of pattern distribution. Our analysis and experimental results show that RPT significantly improves the performance of the pattern distributor network, and that the distributor module's classification accuracy dominates the whole network's performance. Two combination methods, namely, crosstalk-based combination and genetic-algorithm (GA)-based combination, are presented to find a suitable grouping for the distributor module. Experimental results show that this new method can reduce training time and improve network generalization accuracy when compared with a conventional method such as constructive backpropagation or a task decomposition method such as output parallelism (OP).
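
To make the scheme concrete, below is a minimal sketch of how a pattern-distributor network with reduced pattern training might be organized: a distributor module first assigns each input pattern to a class group, and a group-specific sub-module then classifies within that group, having been trained only on its own group's patterns. The two-group split of the digit classes, the scikit-learn MLPs standing in for the paper's constructively trained modules, and the digits dataset are illustrative assumptions, not the authors' configuration.

```python
# Sketch of pattern distributor (PD) + reduced pattern training (RPT).
# Assumptions: a fixed two-group partition of the classes and generic MLPs;
# the paper's actual modules, grouping search (crosstalk/GA), and data differ.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Hypothetical grouping of the 10 digit classes into two groups.
groups = {0: [0, 1, 2, 3, 4], 1: [5, 6, 7, 8, 9]}
group_of = {c: g for g, cs in groups.items() for c in cs}
g_tr = np.array([group_of[c] for c in y_tr])

# Distributor module: learns only the coarse group label of each pattern.
distributor = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                            random_state=0).fit(X_tr, g_tr)

# Sub-modules: one per group, each trained on the reduced pattern set (RPT),
# i.e. only the training patterns whose true class belongs to that group.
sub_modules = {}
for g, classes in groups.items():
    mask = np.isin(y_tr, classes)
    sub_modules[g] = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                                   random_state=0).fit(X_tr[mask], y_tr[mask])

# Inference: the distributor routes each pattern to a sub-module,
# which produces the final class label.
g_pred = distributor.predict(X_te)
y_pred = np.array([sub_modules[g].predict(x.reshape(1, -1))[0]
                   for g, x in zip(g_pred, X_te)])
print("PD+RPT accuracy:", (y_pred == y_te).mean())
```

Training each sub-module only on its own group's patterns is the sense in which the pattern set is "reduced"; each sub-module sees a smaller, simpler training set, which is consistent with the reported reduction in training time, while the distributor's grouping accuracy caps the overall accuracy.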