From: International Conference on Data Stream Mining Processing

New Approaches in the Learning of Complex-Valued Neural Networks

Abstract

We consider neural networks with complex weights and continuous activation functions. The paper studies the complex generalization of the backpropagation learning algorithm. We introduce new kinds of activation functions, namely complex modifications of the rational sigmoid and of the ReLU activation function, whose use has two main benefits. First, applying these functions avoids splitting the transfer functions into real and imaginary parts. Second, they are fast to compute. To improve performance and increase training speed, we use complex versions of modern optimizers instead of the classical techniques based on plain gradient descent. The design of complex-weighted neural networks for multiclass classification is also treated. The simulation results confirm the assumption that combining complex versions of ReLU-like activation functions with the Adam optimizer can considerably speed up the training of complex-valued neural networks.
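The abstract does not reproduce the exact formulas of the proposed activations, but both families it names have well-known non-split complex forms that act on the modulus while preserving the phase, which is exactly what avoids splitting into real and imaginary parts. As a hedged illustration only (the paper's own definitions may differ), a modReLU-style complex ReLU and a complex rational sigmoid can be sketched in NumPy; the bias value `b` here is an arbitrary placeholder:

```python
import numpy as np

def complex_relu(z, b=-0.1):
    """modReLU-style complex ReLU (illustrative, not necessarily the
    paper's exact definition): thresholds the modulus |z| + b, keeps
    the phase z/|z|, so no real/imaginary split is needed."""
    m = np.abs(z)
    # (|z| + b) * z/|z| = z * (1 + b/|z|); guard against division by zero
    scaled = z * (1.0 + b / np.maximum(m, 1e-12))
    return np.where(m + b > 0, scaled, 0)

def complex_rational_sigmoid(z):
    """Complex rational sigmoid z / (1 + |z|): bounded modulus,
    phase preserved, and cheap to compute (no exponentials)."""
    return z / (1.0 + np.abs(z))
```

Both functions are elementwise and differentiable almost everywhere in the real-composite (Wirtinger) sense, which is what a complex backpropagation pass requires; their cost is dominated by one `abs` per element, which is why such activations are fast compared with exponential-based sigmoids.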
