
MACHINE LEARNING THROUGH PARALLELIZED STOCHASTIC GRADIENT DESCENT


Abstract

Systems, methods, and computer media for machine learning through a symbolic, parallelized stochastic gradient descent (SGD) analysis are provided. An initial data portion analyzer can be configured to perform, using a first processor, SGD analysis on an initial portion of a training dataset. Values for output model weights for the initial portion are initialized to concrete values. Local model builders can be configured to perform, using an additional processor for each local model builder, symbolic SGD analysis on an additional portion of the training dataset. The symbolic SGD analysis uses a symbolic representation as an initial state for output model weights for the corresponding portions of the training dataset. The symbolic representation allows the SGD analysis and symbolic SGD analysis to be performed in parallel. A global model builder can be configured to combine outputs of the local model builders and the initial data portion analyzer into a global model.
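For orientation, the sketch below illustrates the mechanism the abstract describes in the simplest setting where symbolic SGD has a closed form: a linear model trained with squared loss, where one SGD step is w' = (I - lr * x x^T) w + lr * y * x and is therefore affine in the weights. Each local model builder can then summarize its data portion as a matrix/offset pair (M, v) whose output is M @ w_in + v for an unknown incoming weight vector w_in, which is what lets the portions be processed in parallel and composed afterwards by the global model builder. The function names, the driver data, and the dense NumPy representation are assumptions for illustration only; a practical implementation would avoid materializing the full d x d matrix M, and the patent's claims are broader than this linear special case.

```python
import numpy as np

def concrete_sgd(X, y, w, lr):
    """Plain SGD over the initial data portion, starting from concrete weights."""
    for xi, yi in zip(X, y):
        w = w - lr * (xi @ w - yi) * xi   # squared-loss gradient step
    return w

def symbolic_sgd(X, y, dim, lr):
    """Symbolic SGD over an additional data portion.

    The unknown incoming weights are treated symbolically, and the portion's
    effect is tracked as the affine map  w_out = M @ w_in + v,  so the portion
    can be processed before w_in is known (i.e., in parallel with the others).
    """
    M = np.eye(dim)      # dependence on the unknown incoming weights
    v = np.zeros(dim)    # concrete offset accumulated so far
    for xi, yi in zip(X, y):
        A = np.eye(dim) - lr * np.outer(xi, xi)   # one affine SGD step
        M = A @ M
        v = A @ v + lr * yi * xi
    return M, v

def combine(w_initial, local_models):
    """Global model builder: fold each local (M, v) over the concrete result."""
    w = w_initial
    for M, v in local_models:
        w = M @ w + v
    return w

# Illustrative driver (sequential here for brevity; the symbolic portions are
# independent and could run on separate processors, as the abstract describes).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X @ np.arange(5.0)
parts = np.array_split(np.arange(300), 3)
w0 = concrete_sgd(X[parts[0]], y[parts[0]], np.zeros(5), lr=0.05)
local_models = [symbolic_sgd(X[p], y[p], dim=5, lr=0.05) for p in parts[1:]]
w_global = combine(w0, local_models)
```

Because each symbolic portion is an exact affine summary of its SGD pass, composing the pairs in order reproduces what a single sequential pass over the whole dataset would compute in this linear case, which is the point of the symbolic representation.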
