International Conference on Computer Communication and Networks

PETS: Bottleneck-Aware Spark Tuning with Parameter Ensembles



Abstract

Tuning Spark's dozens of performance parameters is both challenging and time-consuming. Current techniques rely on trial and error, or on best guesses drawn from expert knowledge that very few possess. Previous tuning approaches are not compatible with Spark, and they also ignore the underlying problem of resource bottlenecks, which is both the cause of performance issues and a potential ally if awareness of it is leveraged to make tuning more effective. We propose and develop PETS, a new method that tunes associated parameters together, using resource bottleneck awareness to adjust parameter ensemble values within a few iterations. A performance evaluation on a testbed implementation shows that with PETS, representative workloads achieve: (1) significant speedups; (2) fast convergence; and (3) performance gains that remain stable across varying workload data sizes, homogeneous and heterogeneous clusters, and initial parameter settings. The results show that PETS outperforms a machine-learning-based method, achieving speedups of up to 4.78x and convergence in as few as 2 iterations.
