Journal: Knowledge-Based Systems

Hyperparameter optimization of deep neural network using univariate dynamic encoding algorithm for searches


Abstract

This paper proposes a method to tune the hyperparameters of a deep neural network using a univariate dynamic encoding algorithm for searches. Optimizing the hyperparameters of such a network is difficult because it has several parameters to configure; furthermore, training such a network is slow. The proposed method was tested on two neural network models, an autoencoder and a convolutional neural network, using the Modified National Institute of Standards and Technology (MNIST) dataset. To optimize hyperparameters with the proposed method, the cost functions were chosen as the average difference between the decoded value and the original image for the autoencoder, and as the inverse of the evaluation accuracy for the convolutional neural network. The hyperparameters were optimized with fast convergence and few computational resources, and the results were compared with those of other optimization algorithms (namely, simulated annealing, the genetic algorithm, and particle swarm optimization) to demonstrate the effectiveness of the proposed methodology. (C) 2019 Elsevier B.V. All rights reserved.
