IEEE International Conference on Networking, Architecture and Storage

Exploring Transfer Learning to Reduce Training Overhead of HPC Data in Machine Learning



Abstract

Nowadays, scientific simulations on high-performance computing (HPC) systems can generate large amounts of data (on the scale of terabytes or petabytes) per run. When this huge volume of HPC data is processed by machine learning applications, the training overhead becomes significant. Typically, the training process for a neural network takes several hours, if not longer, to complete. When machine learning is applied to HPC scientific data, training can take several days or even weeks. Transfer learning, an optimization usually employed to save training time or achieve better performance, has the potential to reduce this large training overhead. In this paper, we apply transfer learning to a machine learning HPC application. We find that transfer learning can reduce training time without, in most cases, significantly increasing the error. This indicates that transfer learning can be very useful for working with HPC datasets in machine learning applications.
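The abstract does not include code, and the paper's actual models and HPC datasets are not shown here. As an illustrative sketch only, the core mechanics of transfer learning that the abstract describes — reuse weights pretrained on one task, freeze the feature layers, and fine-tune the remaining layers on a related task with far fewer training steps — can be demonstrated with a tiny NumPy network on invented toy data. Every function name and both toy tasks below are assumptions made for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in, n_hid, n_out):
    """Random weights for a small two-layer network (no biases, for brevity)."""
    return {
        "W1": rng.normal(0.0, 0.5, (n_in, n_hid)),
        "W2": rng.normal(0.0, 0.5, (n_hid, n_out)),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"])   # hidden features
    return h, h @ p["W2"]      # linear readout / prediction

def train(p, X, y, epochs, lr=0.05, freeze_features=False):
    """Full-batch gradient descent on mean-squared error.
    With freeze_features=True only the output layer is updated."""
    n = len(X)
    for _ in range(epochs):
        h, pred = forward(p, X)
        err = pred - y
        gW2 = h.T @ err / n
        gh = (err @ p["W2"].T) * (1.0 - h ** 2)  # backprop through tanh
        p["W2"] -= lr * gW2
        if not freeze_features:
            p["W1"] -= lr * X.T @ gh / n
    return p

# Source task: fit y = sin(x) from scratch (the expensive pretraining run).
X = rng.uniform(-2.0, 2.0, (200, 1))
y_src = np.sin(X)
p_src = train(init_params(1, 16, 1), X, y_src, epochs=500)

# Target task: a related function, y = 0.8 * sin(x).
y_tgt = 0.8 * np.sin(X)

# Error of the source model on the target task, before any fine-tuning.
_, pred0 = forward(p_src, X)
mse_before = float(np.mean((pred0 - y_tgt) ** 2))

# Transfer learning: copy the pretrained weights, freeze the feature
# layer, and fine-tune only the output layer with far fewer epochs.
p_tl = {"W1": p_src["W1"].copy(), "W2": p_src["W2"].copy()}
p_tl = train(p_tl, X, y_tgt, epochs=100, freeze_features=True)

_, pred1 = forward(p_tl, X)
mse_after = float(np.mean((pred1 - y_tgt) ** 2))

# The frozen layer is untouched, and fine-tuning lowered the target error.
print(np.allclose(p_tl["W1"], p_src["W1"]), mse_after < mse_before)
```

The saving comes from updating only the 16-parameter output layer for 100 epochs instead of the full network for 500. In a realistic deep-learning framework the same idea is expressed by flags such as `layer.trainable = False` in Keras or `param.requires_grad = False` in PyTorch, with the pretrained network coming from a related simulation dataset rather than a toy function.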
