MACHINE LEARNING THROUGH PARALLELIZED STOCHASTIC GRADIENT DESCENT
Abstract
Systems, methods, and computer media for machine learning through a symbolic, parallelized stochastic gradient descent (SGD) analysis are provided. An initial data portion analyzer can be configured to perform, using a first processor, SGD analysis on an initial portion of a training dataset. Values for output model weights for the initial portion are initialized to concrete values. Local model builders can be configured to perform, using an additional processor for each local model builder, symbolic SGD analysis on an additional portion of the training dataset. The symbolic SGD analysis uses a symbolic representation as an initial state for output model weights for the corresponding portions of the training dataset. The symbolic representation allows the SGD analysis and symbolic SGD analysis to be performed in parallel. A global model builder can be configured to combine outputs of the local model builders and the initial data portion analyzer into a global model.
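The symbolic representation that lets the portions run in parallel can be made exact when each SGD update is affine in the incoming weights, as it is for least-squares regression: the step w ← w − η·x·(xᵀw − y) equals (I − η·xxᵀ)w + η·y·x, so a whole data portion reduces to an affine map (M, b) with w_out = M·w_in + b. The sketch below illustrates that idea in that restricted setting; the function names and the least-squares objective are illustrative assumptions, not the patent's implementation, and real local model builders would run on separate processors rather than in a loop.

```python
import numpy as np

def symbolic_sgd(portion, eta, dim):
    """Symbolic SGD over one portion: return (M, b) with w_out = M @ w_in + b."""
    M = np.eye(dim)
    b = np.zeros(dim)
    for x, y in portion:
        A = np.eye(dim) - eta * np.outer(x, x)  # linear part of one SGD step
        M = A @ M                               # compose with earlier steps
        b = A @ b + eta * y * x
    return M, b

def concrete_sgd(portion, eta, w):
    """Ordinary SGD from concrete initial weights (the initial data portion)."""
    for x, y in portion:
        w = w - eta * x * (x @ w - y)
    return w

# Synthetic least-squares training data (illustrative only).
rng = np.random.default_rng(0)
dim, n = 3, 400
w_true = rng.normal(size=dim)
X = rng.normal(size=(n, dim))
Y = X @ w_true
data = list(zip(X, Y))
eta = 0.05

# Split the dataset into an initial portion plus additional portions.
portions = [data[i * 100:(i + 1) * 100] for i in range(4)]

# Initial data portion analyzer: concrete SGD on the first portion.
w = concrete_sgd(portions[0], eta, np.zeros(dim))

# Local model builders: symbolic SGD on each remaining portion, all
# independent of one another, so they could run in parallel.
affine_maps = [symbolic_sgd(p, eta, dim) for p in portions[1:]]

# Global model builder: apply each portion's affine map in dataset order.
for M, b in affine_maps:
    w = M @ w + b

# The result coincides with purely sequential SGD over the full dataset.
w_seq = np.zeros(dim)
for x, y in data:
    w_seq = w_seq - eta * x * (x @ w_seq - y)
assert np.allclose(w, w_seq, atol=1e-6)
```

Because the affine maps compose exactly in this linear setting, the parallel combination reproduces the sequential result rather than approximating it; the cost is carrying a d×d matrix per portion instead of a single weight vector.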