A LAHC-based Job Scheduling Strategy to Improve Big Data Processing in Geo-distributed Contexts

International Conference on Internet of Things, Big Data and Security


Abstract

The widespread adoption of IoT technologies has resulted in the generation of huge amounts of data, or Big Data, which must be collected, stored, and processed with new techniques to produce value in the best possible way. Distributed computing frameworks such as Hadoop, based on the MapReduce paradigm, have been used to process such amounts of data by exploiting the computing power of many cluster nodes. Unfortunately, in many real Big Data applications the data to be processed reside in several computationally heterogeneous data centers distributed across different locations. In this context Hadoop's performance degrades dramatically. To address this issue, we developed a Hierarchical Hadoop Framework (H2F) capable of scheduling and distributing tasks among geographically distant clusters in a way that minimizes the overall job execution time. This work focuses on the definition of a job scheduling system based on a one-point iterative search algorithm, Late Acceptance Hill Climbing (LAHC), that increases the framework's scalability while guaranteeing good job performance.
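To make the abstract's "one-point iterative search" concrete, the sketch below shows the standard LAHC acceptance rule applied to a toy task-placement problem. It is a minimal illustration under assumed conventions, not the H2F implementation: the cost function estimate_makespan, the move random_reassignment, and all constants are hypothetical stand-ins for whatever performance model and neighborhood the framework actually uses.

```python
import random

# Toy instance: assign each map task to one of the data centers.
# Speeds and task sizes are hypothetical, not from the paper.
COMPUTE_POWER = [1.0, 0.5, 2.0]        # relative speed of each data center
TASK_SIZE = [random.uniform(1, 10) for _ in range(40)]

def estimate_makespan(assignment):
    """Hypothetical cost: completion time of the most loaded data center."""
    load = [0.0] * len(COMPUTE_POWER)
    for task, dc in enumerate(assignment):
        load[dc] += TASK_SIZE[task] / COMPUTE_POWER[dc]
    return max(load)

def random_reassignment(assignment):
    """Neighborhood move: relocate one random task to a random data center."""
    neighbor = list(assignment)
    task = random.randrange(len(neighbor))
    neighbor[task] = random.randrange(len(COMPUTE_POWER))
    return neighbor

def lahc(initial, cost, neighbor, history_len=50, max_iters=20000):
    """One-point iterative search with the LAHC acceptance rule:
    accept a candidate if it beats the current solution OR the cost
    recorded history_len iterations earlier."""
    current, current_cost = initial, cost(initial)
    history = [current_cost] * history_len   # circular fitness history
    best, best_cost = current, current_cost
    for i in range(max_iters):
        candidate = neighbor(current)
        candidate_cost = cost(candidate)
        v = i % history_len
        if candidate_cost <= current_cost or candidate_cost <= history[v]:
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        history[v] = current_cost            # record for late acceptance
    return best, best_cost

start = [random.randrange(len(COMPUTE_POWER)) for _ in TASK_SIZE]
plan, makespan = lahc(start, estimate_makespan, random_reassignment)
print(f"estimated makespan: {makespan:.2f}")
```

Because LAHC maintains a single candidate solution and a fixed-length cost history, each iteration costs one neighbor evaluation regardless of problem size, which is consistent with the scalability claim made in the abstract.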
