SIAM Journal on Scientific Computing

MULTILEVEL ARTIFICIAL NEURAL NETWORK TRAINING FOR SPATIALLY CORRELATED LEARNING



Abstract

Multigrid modeling algorithms are a technique used to accelerate iterative method models running on a hierarchy of similar graphlike structures. We introduce and demonstrate a new method for training neural networks that uses multilevel methods. Using an objective function derived from a graph-distance metric, we perform orthogonally-constrained optimization to find optimal prolongation and restriction maps between graphs. We compare and contrast several methods for performing this numerical optimization, and additionally present new theoretical results on upper bounds for this type of objective function. Once calculated, these optimal maps between graphs form the core of multiscale artificial neural network (MsANN) training, a new procedure we present that simultaneously trains a hierarchy of neural network models of varying spatial resolution. Parameter information is passed between members of this hierarchy according to standard coarsening and refinement schedules from the multiscale modeling literature. In our machine learning experiments, these models learn faster than an equivalent model trained at the fine scale alone, achieving a comparable level of error with an order of magnitude fewer weight updates.
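
As an illustration of the orthogonally-constrained optimization step, the sketch below finds a prolongation map P with orthonormal columns between a small fine graph and a small coarse graph by minimizing the surrogate objective ||A_f P - P A_c||_F^2 with projected gradient descent and a QR retraction. This is a toy under stated assumptions (the helper names path_adjacency and qf, the path-graph test case, and the exact form of the objective are all illustrative choices), not the paper's implementation.

# Illustrative sketch (not the paper's code): find a prolongation map P with
# orthonormal columns between a fine graph and a coarse graph by minimizing
# the graph-distance surrogate ||A_f P - P A_c||_F^2 with projected gradient
# descent and a QR retraction onto the Stiefel manifold.
import numpy as np

rng = np.random.default_rng(1)

def path_adjacency(n):
    # Adjacency matrix of a path graph on n nodes (stand-in test graphs).
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = A[idx + 1, idx] = 1.0
    return A

def qf(X):
    # QR-based retraction: orthonormal factor with the diagonal of R made positive.
    Q, R = np.linalg.qr(X)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0
    return Q * s

A_f = path_adjacency(8)   # fine-scale graph
A_c = path_adjacency(4)   # coarse-scale graph

P = qf(rng.standard_normal((8, 4)))   # random starting point with P^T P = I

lr = 0.01
for step in range(500):
    E = A_f @ P - P @ A_c                 # residual of the intertwining relation A_f P ~ P A_c
    grad = 2.0 * (A_f @ E - E @ A_c)      # Euclidean gradient (A_f, A_c are symmetric)
    P = qf(P - lr * grad)                 # gradient step followed by retraction

print("objective:", float(np.sum((A_f @ P - P @ A_c) ** 2)))
print("orthogonality error:", float(np.linalg.norm(P.T @ P - np.eye(4))))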
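The multiscale training loop itself can be sketched in the same spirit. In the toy below, a fine-scale weight vector and a coarse correction are linked by a prolongation map P (restriction is its transpose), and gradient steps alternate between scales on a fixed coarsen/refine schedule. The quadratic loss and every name here are stand-ins for an actual MsANN hierarchy, not the authors' code.

# Illustrative sketch (assumptions, not the paper's implementation) of the
# coarsen/refine parameter-transfer idea: a fine-scale weight vector and a
# coarse correction are linked by the prolongation map P (restriction is P^T),
# and gradient steps alternate between scales on a fixed schedule. The
# quadratic toy loss stands in for a real network's training loss.
import numpy as np

rng = np.random.default_rng(0)
n_fine, n_coarse = 8, 4

# Prolongation with orthonormal columns (random here; in MsANN it would come
# from the graph-distance optimization sketched above).
P, _ = np.linalg.qr(rng.standard_normal((n_fine, n_coarse)))

# Toy quadratic loss 0.5 w^T A w - b^T w on the fine scale.
A = rng.standard_normal((n_fine, n_fine))
A = A.T @ A + np.eye(n_fine)              # symmetric positive definite
b = rng.standard_normal(n_fine)

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def fine_grad(w):
    return A @ w - b

def coarse_grad(wc, w):
    # Gradient of the same loss restricted to the coarse subspace around w.
    return P.T @ fine_grad(w + P @ wc)

w = np.zeros(n_fine)
lr = 0.02
schedule = ["fine", "coarse", "fine", "coarse", "fine"]   # toy coarsen/refine schedule

for level in schedule:
    if level == "fine":
        for _ in range(10):
            w -= lr * fine_grad(w)                        # ordinary fine-scale steps
    else:
        wc = np.zeros(n_coarse)
        for _ in range(10):
            wc -= lr * coarse_grad(wc, w)                 # train a coarse correction
        w = w + P @ wc                                    # prolong it back to the fine model
    print(level, "loss =", float(loss(w)))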
