Journal of Computers

Impact of Different Random Initializations on Generalization Performance of Extreme Learning Machine



Abstract

The generalization performance of the extreme learning machine (ELM) is influenced by the random initialization of its input-layer weights and hidden-layer biases. In this paper, we demonstrate this by comparing the classification accuracies of ELMs under different random initializations. The experimental study employs 30 UCI data sets and 24 continuous probability distributions. The results yield the following observations and conclusions: (1) distributions with symmetric, bell-shaped probability density functions (e.g., Hyperbolic Secant, Student's-t, Laplace, and Normal) consistently produce higher training accuracies but easily cause ELM to over-fit; (2) ELMs whose random input-layer weights and hidden-layer biases are drawn from heavy-tailed distributions (e.g., Gamma, Rayleigh, and Frechet) achieve better generalization performance; and (3) light-tailed distributions (e.g., Central Chi-Squared, Erlang, F, Gumbel, and Logistic) are usually ill-suited to initializing the input-layer weights and hidden-layer biases of ELM. These findings provide useful guidance for practical applications of ELMs in different fields.
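To make the experimental setup concrete, the following is a minimal NumPy sketch of a basic ELM whose input-layer weights and hidden-layer biases are sampled from a pluggable distribution. This is an illustration of the general ELM training procedure (random hidden layer, least-squares output layer), not the authors' experimental code; the toy data, the tanh activation, the hidden-layer size, and the two example distributions are assumptions.

```python
import numpy as np

def train_elm(X, T, n_hidden, init):
    """Train a single-hidden-layer ELM.
    `init(shape)` samples the random input-layer weights and biases."""
    n_features = X.shape[1]
    W = init((n_features, n_hidden))   # random input-layer weights
    b = init((1, n_hidden))            # random hidden-layer biases
    H = np.tanh(X @ W + b)             # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T       # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Compare two initialization distributions on toy binary-classification data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
T = (X.sum(axis=1, keepdims=True) > 0).astype(float)

inits = [("Normal", rng.standard_normal),
         ("Gamma",  lambda shape: rng.gamma(2.0, 1.0, size=shape))]
for name, init in inits:
    W, b, beta = train_elm(X, T, n_hidden=50, init=init)
    acc = ((predict_elm(X, W, b, beta) > 0.5) == T).mean()
    print(f"{name}: training accuracy = {acc:.2f}")
```

Swapping the `init` callable is all that changes between the 24 distributions compared in the paper; the hidden layer is never trained, so the sampled distribution directly shapes the random feature map and, in turn, the fitted output weights.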
