Global optimization of neural network


Abstract

The problem of cost functions converging to local optima has hindered progress not only in traditional mathematics but also in the emerging field of Neural Networks (NN). Since most cost functions used in the gradient descent of an NN are non-convex, there is a high probability that the algorithm converges to a local optimum. Deep learning has been introduced to eliminate the substantial error caused by this problem; yet, a standard computing environment cannot supply the computation that deep learning requires. The fake alpha algorithm, newly introduced in this paper, shares the same goal as the deep learning approach but can be run successfully on standard computers. Additionally, the heuristic approach that the fake alpha algorithm takes during gradient descent enhances the efficiency of the NN.
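As an illustration of the local-optimum problem the abstract describes, the following minimal Python sketch shows plain gradient descent converging to different optima depending on its starting point. The toy cost function, step size, and step count are assumptions made for the example; the paper's fake alpha algorithm itself is not specified in the abstract and is not implemented here.

    # Illustration only: the abstract does not specify the fake alpha
    # algorithm, so this sketch shows the problem it targets -- plain
    # gradient descent getting trapped in a local optimum of a
    # non-convex cost. The toy cost, step size, and step count are
    # assumptions made for the example.

    def cost(x):
        # Non-convex cost with a global minimum near x = -1.30 and a
        # shallower local minimum near x = +1.13.
        return x**4 - 3 * x**2 + x

    def grad(x):
        # Analytic derivative of cost(x).
        return 4 * x**3 - 6 * x + 1

    def gradient_descent(x0, lr=0.01, steps=500):
        # Fixed-step gradient descent from starting point x0.
        x = x0
        for _ in range(steps):
            x -= lr * grad(x)
        return x

    for x0 in (-2.0, 2.0):
        x_star = gradient_descent(x0)
        print(f"start {x0:+.1f} -> x = {x_star:+.4f}, cost = {cost(x_star):+.4f}")

    # Starting from +2.0 the run stalls at the local minimum (~+1.13);
    # only the run from -2.0 reaches the global minimum (~-1.30).
    # Heuristics such as the paper's fake alpha algorithm aim to reduce
    # this dependence on the starting point without requiring the
    # compute budget of deep learning.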

Bibliographic Record

  • Conference venue: Chuncheon-si, Gangwon-do (KR)
  • Author affiliations:

    Department of Natural Science, Hankuk Academy of Foreign Studies, 50, Oedae-ro 54beon-gil, Mohyeon-myeon, Cheoin-gu, Yongin-si, Gyeonggi-do, Republic of Korea;

    Department of Natural Science, Hankuk Academy of Foreign Studies, 50, Oedae-ro 54beon-gil, Mohyeon-myeon, Cheoin-gu, Yongin-si, Gyeonggi-do, Republic of Korea;

    Department of Computer Science and Engineering, Korea University, Seoul, Republic of Korea;

  • Original format: PDF
  • Language: English
