
A Distributed Computing Framework Based on Lightweight Variance Reduction Method to Accelerate Machine Learning Training on Blockchain

     

Abstract

To securely support large-scale intelligent applications, distributed machine learning based on blockchain is an intuitive solution. However, distributed machine learning models are difficult to train because the corresponding optimization solvers converge slowly and place heavy demands on computing and memory resources. To overcome these challenges, we propose a distributed computing framework for the L-BFGS optimization algorithm based on a variance reduction method, which provides a lightweight, low-overhead, and parallelized scheme for the model training process. To validate these claims, we conducted several experiments on multiple classical datasets. The results show that the proposed computing framework steadily accelerates the training process of the solver in both local and distributed modes.
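The abstract only names the building blocks (L-BFGS combined with a variance reduction method) without giving algorithmic details. As a rough illustration of how such pieces typically fit together, the sketch below pairs an SVRG-style variance-reduced gradient estimator with the standard L-BFGS two-loop recursion on a synthetic logistic-regression problem. The choice of SVRG, the fixed step size, the memory size, and the curvature check are assumptions made for the sketch, not the framework described in the paper.

```python
# Minimal sketch (not the paper's implementation): stochastic L-BFGS with an
# SVRG-style variance-reduced gradient on synthetic logistic regression.
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the mean logistic loss at w for samples (X, y), y in {-1, +1}."""
    z = -y * (X @ w)
    s = 1.0 / (1.0 + np.exp(-z))              # sigma(-y * x^T w)
    return (X * (-y * s)[:, None]).mean(axis=0)

def two_loop(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H_k @ grad."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest to oldest
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a))
    if s_list:                                  # scale by initial Hessian estimate
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (rho, a), (s, y) in zip(reversed(alphas), zip(s_list, y_list)):  # oldest to newest
        b = rho * (y @ q)
        q += (a - b) * s
    return q

rng = np.random.default_rng(0)
n, d = 2000, 20
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d) + 0.1 * rng.normal(size=n))

w = np.zeros(d)
s_list, y_hist = [], []
memory, batch, lr, epochs = 10, 64, 0.1, 20    # assumed hyperparameters

for epoch in range(epochs):
    w_snap = w.copy()
    full_grad = logistic_grad(w_snap, X, y)     # full gradient at the snapshot
    for _ in range(n // batch):
        idx = rng.choice(n, batch, replace=False)
        # SVRG variance-reduced gradient estimate
        g = (logistic_grad(w, X[idx], y[idx])
             - logistic_grad(w_snap, X[idx], y[idx])
             + full_grad)
        w_new = w - lr * two_loop(g, s_list, y_hist)
        s_vec = w_new - w
        y_vec = logistic_grad(w_new, X[idx], y[idx]) - logistic_grad(w, X[idx], y[idx])
        if s_vec @ y_vec > 1e-10:               # curvature check keeps updates stable
            s_list.append(s_vec)
            y_hist.append(y_vec)
            if len(s_list) > memory:
                s_list.pop(0)
                y_hist.pop(0)
        w = w_new
    loss = np.mean(np.logaddexp(0.0, -y * (X @ w)))
    print(f"epoch {epoch:2d}  loss {loss:.4f}")
```

In this sketch the variance-reduced estimate replaces the plain mini-batch gradient in both the search direction and the curvature pairs, which is the general way variance reduction is used to stabilize stochastic quasi-Newton updates; the distributed and blockchain aspects of the paper are not modeled here.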

Bibliographic Record

  • Source
    China Communications (《中国通信》) | 2020, No. 9 | pp. 77-89 | 13 pages
  • Author Affiliations

    Science and Technology on Parallel and Distributed Laboratory, National University of Defense Technology, Changsha 410000, China;

    Science and Technology on Parallel and Distributed Laboratory, National University of Defense Technology, Changsha 410000, China;

    Science and Technology on Parallel and Distributed Laboratory, National University of Defense Technology, Changsha 410000, China;

    H.R. Support Center, PLA, Beijing 100000, China;

    Science and Technology on Parallel and Distributed Laboratory, National University of Defense Technology, Changsha 410000, China;

  • Format: PDF
  • Language: English
  • Indexed: 2023-07-25 20:36:40
