IEEE Conference on Computer Communications

A Flexible Distributed Optimization Framework for Service of Concurrent Tasks in Processing Networks

Abstract

Distributed optimization has important applications in the practical implementation of machine learning and signal processing systems, as it allows an interconnected network of processors to work towards the optimization of a global objective with intermittent communication. Existing works on distributed optimization predominantly assume that all processors storing related data perform updates for the optimization task in each iteration. However, such optimization processes are typically executed at shared computing/data centers alongside other concurrent tasks. Therefore, it is necessary to develop efficient distributed optimization methods that possess the flexibility to share computing resources with other ongoing tasks. In this work, we propose a new first-order framework that provides this flexibility through a probabilistic computing resource allocation strategy while guaranteeing satisfactory performance of the distributed optimization. Our results, both analytical and numerical, show that by controlling a flexibility parameter, our suite of algorithms (designed for various scenarios) can achieve lower computation and communication costs than their inflexible counterparts. The framework also enables fair sharing of common resources with other concurrent tasks being processed by the processing network.
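The central mechanism described above is that, in each iteration, a processor contributes a first-order update only with some probability governed by the flexibility parameter, so the rest of its compute can serve concurrent tasks. The following sketch illustrates this idea on a toy least-squares problem; it is not the paper's algorithm, and the participation probability q, step size lr, and the simple coordinator-side averaging are illustrative assumptions.

import numpy as np

# Toy setup: each of n_workers processors holds a private least-squares term,
# and the global objective is f(x) = sum_i ||A_i x - b_i||^2 / 2.
rng = np.random.default_rng(0)
n_workers, dim, rounds = 8, 5, 2000
q = 0.5     # hypothetical flexibility parameter: per-round participation probability
lr = 0.02   # step size chosen for this toy problem, not taken from the paper

A = [rng.standard_normal((20, dim)) for _ in range(n_workers)]
x_true = rng.standard_normal(dim)
b = [Ai @ x_true + 0.01 * rng.standard_normal(20) for Ai in A]

x = np.zeros(dim)  # shared iterate maintained by a coordinator
for _ in range(rounds):
    grad = np.zeros(dim)
    active = 0
    for Ai, bi in zip(A, b):
        if rng.random() < q:              # processor joins this round; otherwise
            grad += Ai.T @ (Ai @ x - bi)  # its resources serve concurrent tasks
            active += 1
    if active:                            # average only the received local gradients
        x -= lr * grad / active
print("distance to x_true:", np.linalg.norm(x - x_true))

With q = 1 every processor works in every round, recovering the inflexible baseline; smaller q trades per-round progress for compute that can be allocated to the other tasks sharing the processing network.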
