International Conference on Emerging Network Intelligence

An Evolutionary Training Algorithm for Artificial Neural Networks with Dynamic Offspring Spread and Implicit Gradient Information



Abstract

Evolutionary training methods for Artificial Neural Networks can escape local minima, which makes them useful for training recurrent neural networks for short-term weather forecasting. However, due to their stochastic nature, these algorithms are not guaranteed to converge quickly, or to converge at all. In this paper, we present an algorithm that uses implicit gradient information and is able to train existing individuals in order to create a dynamic reproduction probability density. It allows us to train and re-train an Artificial Neural Network in a supervised manner to forecast weather conditions.
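
The abstract does not specify the paper's exact update rules, so the following is only a minimal Python sketch of the general idea: a population-based training loop for a small feed-forward network in which each parent carries its own offspring spread (mutation standard deviation), and the spread is adapted from implicit gradient information, here taken to be the fitness change between a parent and its offspring. The network sizes, population size, spread update factors, and the toy data set are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' exact method): evolutionary training of a
# tiny network with a per-individual offspring spread that adapts based on
# whether mutations improved fitness. All hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def init_weights(sizes):
    """Flat parameter vector for a small fully connected network."""
    return np.concatenate([rng.normal(0, 0.5, size=m * n + n)
                           for m, n in zip(sizes[:-1], sizes[1:])])

def forward(w, x, sizes):
    """Evaluate the network on inputs x, unpacking the flat vector w."""
    idx, a = 0, x
    for m, n in zip(sizes[:-1], sizes[1:]):
        W = w[idx:idx + m * n].reshape(m, n); idx += m * n
        b = w[idx:idx + n]; idx += n
        a = np.tanh(a @ W + b)
    return a

def fitness(w, x, y, sizes):
    """Negative mean squared error: higher is better."""
    return -np.mean((forward(w, x, sizes) - y) ** 2)

# Toy supervised regression task standing in for short-term weather data.
sizes = [3, 8, 1]
x = rng.normal(size=(64, 3))
y = np.sin(x[:, :1]) + 0.1 * x[:, 1:2]

pop = [init_weights(sizes) for _ in range(20)]
spread = np.full(len(pop), 0.1)          # per-individual offspring spread

for gen in range(200):
    fits = np.array([fitness(w, x, y, sizes) for w in pop])
    new_pop, new_spread = [], []
    for i in np.argsort(fits)[-10:]:     # keep the better half as parents
        parent, sigma = pop[i], spread[i]
        child = parent + rng.normal(0, sigma, size=parent.shape)
        gain = fitness(child, x, y, sizes) - fits[i]
        # Implicit gradient information: a successful mutation suggests the
        # current spread is productive, so widen it; otherwise shrink it.
        sigma *= 1.2 if gain > 0 else 0.85
        new_pop += [parent, child]
        new_spread += [sigma, sigma]
    pop, spread = new_pop, np.array(new_spread)

best = max(pop, key=lambda w: fitness(w, x, y, sizes))
print("final MSE:", -fitness(best, x, y, sizes))
```

Because selection and the spread update only compare fitness values, no explicit gradients are computed; the spread acts as the dynamic reproduction density the abstract refers to, concentrating offspring around parents whose recent mutations paid off.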
