
Research on big data processing technology based on Hadoop



Abstract

In the era of big data, traditional data storage and computing models cannot meet the growing demands of massive data, and the emergence of Hadoop technology offers a good solution to this problem. Hadoop is a software framework for the distributed processing of large amounts of data; its core components are HDFS and MapReduce, and it offers high reliability, scalability, efficiency, and fault tolerance. Many enterprises now use Hadoop to build their own data processing platforms. Based on studies of several basic Hadoop big data processing cases, this paper outlines a Hadoop-based big data processing architecture and specifically optimizes HDFS for the storage of small files.
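The small-file issue the abstract targets arises because HDFS keeps one metadata entry per file in the NameNode's memory, so millions of tiny files exhaust metadata capacity long before disk space. The standard remedy is to pack many small files into one container object (Hadoop's SequenceFile or HAR archives follow this pattern). As a minimal illustrative sketch of the packing idea only, not the real SequenceFile format, a filename-to-bytes container might look like:

```python
import io
import struct

def pack_small_files(files: dict) -> bytes:
    """Pack many small files (name -> bytes) into one container blob.
    Mirrors the idea behind SequenceFile/HAR: store one large object
    instead of one NameNode metadata entry per tiny file."""
    buf = io.BytesIO()
    for name, data in files.items():
        nb = name.encode("utf-8")
        # Record header: 4-byte key length + 4-byte value length (big-endian).
        buf.write(struct.pack(">II", len(nb), len(data)))
        buf.write(nb)
        buf.write(data)
    return buf.getvalue()

def unpack_small_files(blob: bytes) -> dict:
    """Inverse of pack_small_files: recover the name -> bytes mapping."""
    out, off = {}, 0
    while off < len(blob):
        klen, vlen = struct.unpack_from(">II", blob, off)
        off += 8
        name = blob[off:off + klen].decode("utf-8")
        off += klen
        out[name] = blob[off:off + vlen]
        off += vlen
    return out
```

In a real deployment the container would be written to HDFS as a single large block-aligned file, and an index (as in HAR) would allow random access to individual entries without scanning the whole blob.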
