International journal of machine learning and cybernetics

A competitive swarm optimizer with hybrid encoding for simultaneously optimizing the weights and structure of Extreme Learning Machines for classification problems


Abstract

Extreme Learning Machine (ELM) is a recently proposed learning algorithm for training single hidden layer feedforward networks (SLFN). It has many attractive properties, including better generalization performance and very fast learning. ELM starts by assigning random values to the input weights and hidden biases and then, in a single step, determines the output weights using the Moore-Penrose generalized inverse. Despite these advantages, ELM performance can be affected by the random initialization of the weights and biases, or by an overly large generated network that contains an unnecessary number of neurons. In order to increase the generalization performance and to produce more compact networks, a hybrid model that combines ELM with the competitive swarm optimizer (CSO) is proposed in this paper. The proposed model (CSONN-ELM) optimizes the weights and biases and dynamically determines the most appropriate number of neurons. To evaluate the effectiveness of CSONN-ELM, it is tested on 23 benchmark datasets and compared to a set of static rules extracted from the literature for determining the number of neurons of an SLFN. Moreover, it is compared to two dynamic approaches used to enhance the performance of ELM, namely Optimally Pruned ELM (OP-ELM) and metaheuristic-based ELMs (Particle Swarm Optimization-ELM and Differential Evolution-ELM). The obtained results show that the proposed method enhances the generalization performance of ELM and outperforms both the static and dynamic methods.
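The one-step ELM training procedure summarized in the abstract can be sketched as follows. This is a minimal NumPy illustration assuming a sigmoid hidden activation and one-hot class targets; the function names, fixed random seed, and layer sizes are hypothetical and not taken from the paper.

    import numpy as np

    def train_elm(X, T, n_hidden, seed=0):
        """Basic ELM step: random input weights and hidden biases,
        output weights obtained via the Moore-Penrose pseudoinverse."""
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        W = rng.standard_normal((n_features, n_hidden))   # random input weights
        b = rng.standard_normal(n_hidden)                 # random hidden biases
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # hidden-layer output matrix
        beta = np.linalg.pinv(H) @ T                      # output weights in one step
        return W, b, beta

    def predict_elm(X, W, b, beta):
        """Predict class labels by taking the largest network output."""
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return np.argmax(H @ beta, axis=1)

In a hybrid scheme such as the one described above, a candidate solution in the swarm would additionally carry the information needed to decide how many hidden neurons to keep, while the output weights are still computed in closed form; the exact hybrid encoding used by CSONN-ELM is specified in the full paper, not in this sketch.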

