Knowledge-Based Systems

Hyperparameter optimization of deep neural network using univariate dynamic encoding algorithm for searches



Abstract

This paper proposes a method to tune the hyperparameters of a deep neural network using the univariate dynamic encoding algorithm for searches. Optimizing the hyperparameters of such a network is difficult because it has several parameters to configure and, furthermore, its training is slow. The proposed method was tested on two neural network models, an autoencoder and a convolutional neural network, with the Modified National Institute of Standards and Technology (MNIST) dataset. To optimize hyperparameters with the proposed method, the cost functions were chosen as the average of the difference between the decoded output and the original image for the autoencoder, and as the inverse of the evaluation accuracy for the convolutional neural network. The hyperparameters were optimized with fast convergence and few computational resources, and the results were compared with those of the other optimization algorithms considered (namely, simulated annealing, a genetic algorithm, and particle swarm optimization) to show the effectiveness of the proposed methodology. (C) 2019 Elsevier B.V. All rights reserved.
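The abstract does not include code; as a rough illustration of the two cost functions it describes, here is a minimal Python sketch. The function names and the array-based interface are assumptions for illustration, not from the paper, and the use of the mean absolute difference (rather than, say, the mean squared difference) is likewise an assumption about what "average of the difference" means.

```python
import numpy as np

def autoencoder_cost(decoded: np.ndarray, original: np.ndarray) -> float:
    """Average of the (absolute) difference between the decoded output
    and the original image, as described for the autoencoder."""
    return float(np.mean(np.abs(decoded - original)))

def cnn_cost(eval_accuracy: float) -> float:
    """Inverse of the evaluation accuracy, as described for the CNN,
    so higher accuracy maps to lower cost."""
    return 1.0 / eval_accuracy
```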
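For intuition about the one-variable-at-a-time strategy, the following sketch performs a coordinate-wise search with a successively halved step size. This is only a simplified stand-in in the spirit of uDEAS, not the published algorithm, which dynamically extends a binary encoding of each variable to finer resolutions; the bounds, iteration counts, and toy cost surface below are all assumptions for illustration.

```python
import numpy as np

def univariate_search(cost, x0, lo, hi, rounds=5, refinements=6):
    """Optimize one variable at a time, halving the step each refinement.
    A simplified stand-in for uDEAS (assumed structure, not the paper's
    implementation)."""
    x = np.asarray(x0, dtype=float)
    best = cost(x)
    for _ in range(rounds):
        for i in range(len(x)):                  # one hyperparameter at a time
            step = (hi[i] - lo[i]) / 4.0
            for _ in range(refinements):
                for cand in (x[i] - step, x[i] + step):
                    trial = x.copy()
                    trial[i] = np.clip(cand, lo[i], hi[i])
                    c = cost(trial)
                    if c < best:                 # keep the improving move
                        best, x = c, trial
                step /= 2.0                      # finer resolution next pass
    return x, best

# Toy usage: tune (learning rate, dropout) against a synthetic cost surface.
toy_cost = lambda v: (v[0] - 0.01) ** 2 + (v[1] - 0.5) ** 2
x_best, c_best = univariate_search(toy_cost, x0=[0.1, 0.2],
                                   lo=[1e-4, 0.0], hi=[1.0, 0.9])
print(x_best, c_best)
```

In practice, `cost` would train the network with the candidate hyperparameters and return one of the cost values sketched above, which is why an algorithm with few function evaluations matters here.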
