ACM Transactions on Mathematical Software

HyperNOMAD: Hyperparameter Optimization of Deep Neural Networks Using Mesh Adaptive Direct Search

Abstract

The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process. When facing a new application, tuning a deep neural network is a tedious and time-consuming process that is often described as a "dark art." This explains the necessity of automating the calibration of these hyperparameters. Derivative-free optimization is a field that develops methods designed to optimize time-consuming functions without relying on derivatives. This work introduces the HyperNOMAD package, an extension of the NOMAD software that applies the MADS algorithm [7] to simultaneously tune the hyperparameters responsible for both the architecture and the learning process of a deep neural network (DNN). This generic approach provides significant flexibility in exploring the search space by taking advantage of categorical variables. HyperNOMAD is tested on the MNIST, Fashion-MNIST, and CIFAR-10 datasets and achieves results comparable to the current state of the art.
