IEEE Transactions on Neural Networks and Learning Systems

Ensemble Learning in Fixed Expansion Layer Networks for Mitigating Catastrophic Forgetting



Abstract

Catastrophic forgetting is a well-studied attribute of most parameterized supervised learning systems. A variation of this phenomenon, in the context of feedforward neural networks, arises when nonstationary inputs lead to loss of previously learned mappings. The majority of the schemes proposed in the literature for mitigating catastrophic forgetting were not data driven and did not scale well. We introduce the fixed expansion layer (FEL) feedforward neural network, which embeds a sparsely encoding hidden layer to help mitigate forgetting of prior learned representations. In addition, we investigate a novel framework for training ensembles of FEL networks, based on exploiting an information-theoretic measure of diversity between FEL learners, to further control undesired plasticity. The proposed methodology is demonstrated on a basic classification task, clearly emphasizing its advantages over existing techniques. The architecture proposed can be enhanced to address a range of computational intelligence tasks, such as regression problems and system control.
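The core mechanism named in the abstract is a hidden layer with fixed weights and a sparse code, so that different inputs activate largely disjoint subsets of units and gradient updates for one task overwrite fewer weights relied on by another. The following is a minimal sketch of that general idea only, not the authors' exact FEL architecture: it assumes random fixed expansion weights and a simple k-winners-take-all rule for enforcing sparsity, both of which are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

class SparseExpansionLayer:
    """Illustrative fixed expansion layer (not the paper's exact design).

    Weights are random and never trained; only the k most strongly
    driven units stay active for any input, so the resulting code is
    sparse and input-specific.
    """

    def __init__(self, n_in, n_expand, k):
        self.W = rng.standard_normal((n_in, n_expand))  # fixed, untrained
        self.k = k

    def __call__(self, x):
        a = x @ self.W                      # dense pre-activations
        h = np.zeros_like(a)
        top = np.argsort(a)[-self.k:]       # indices of the k largest
        h[top] = a[top]                     # all other units forced to zero
        return h

fel = SparseExpansionLayer(n_in=10, n_expand=200, k=10)
h = fel(rng.standard_normal(10))
print(h.shape, np.count_nonzero(h))  # (200,) 10
```

A trainable readout layer on top of `h` would then see sparse, largely non-overlapping codes for dissimilar inputs, which is the property the abstract credits with mitigating forgetting.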


