IEEE Conference on Computer Communications

A Flexible Distributed Optimization Framework for Service of Concurrent Tasks in Processing Networks



Abstract

Distributed optimization has important applications in the practical deployment of machine learning and signal processing systems, as it provides the means for an interconnected network of processors to work toward the optimization of a global objective with only intermittent communication. Existing work on distributed optimization predominantly assumes that all processors storing the relevant data perform updates for the optimization task in every iteration. However, such optimization processes are typically executed at shared computing/data centers alongside other concurrent tasks. It is therefore necessary to develop efficient distributed optimization methods that have the flexibility to share computing resources with other ongoing tasks. In this work, we propose a new first-order framework that provides this flexibility through a probabilistic computing resource allocation strategy while guaranteeing satisfactory performance of the distributed optimization. Our results, both analytical and numerical, show that by controlling a flexibility parameter, our suite of algorithms (designed for various scenarios) achieves lower computation and communication costs than their inflexible counterparts. The framework also enables fair sharing of common resources with the other concurrent tasks handled by the processing network.
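The probabilistic resource-allocation idea described above can be illustrated with a minimal simulation. This is a hedged sketch, not the authors' actual algorithm: it assumes each of `n_workers` processors holds a simple local quadratic loss, and in each iteration a worker participates only with probability `p` (playing the role of the flexibility parameter), with received gradients rescaled by `1/p` so the aggregated gradient estimate stays unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each "processor" holds a local quadratic loss f_i(x) = 0.5 * ||x - c_i||^2,
# so the global objective (the average of the f_i) is minimized at the mean
# of the centers c_i.  These names and losses are illustrative assumptions.
n_workers, dim = 8, 5
centers = rng.normal(size=(n_workers, dim))
x_star = centers.mean(axis=0)

def flexible_gd(p, steps=2000):
    """Distributed gradient descent in which each worker joins a given
    iteration only with probability p, leaving its processor free for
    other concurrent tasks otherwise.  Received gradients are rescaled
    by 1/p so the aggregate remains an unbiased gradient estimate."""
    x = np.zeros(dim)
    for t in range(steps):
        active = rng.random(n_workers) < p   # Bernoulli participation
        if not active.any():
            continue                          # no worker reported this round
        # Unbiased estimate of the full gradient (1/n) * sum_i (x - c_i).
        g = (x - centers[active]).sum(axis=0) / (p * n_workers)
        x -= (2.0 / (t + 5)) * g              # diminishing step size
    return x

# Lower participation frees up compute in each round, yet the iterates
# still converge to a neighborhood of the optimum.
print(np.linalg.norm(flexible_gd(p=0.5) - x_star))
print(np.linalg.norm(flexible_gd(p=1.0) - x_star))
```

Smaller `p` means each processor spends fewer rounds on the optimization task (and more on concurrent work), at the cost of noisier per-iteration updates; the `1/p` rescaling is the standard device that keeps the expected update equal to the full-participation gradient.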
