Published in: IEEE International Congress on Big Data

SD-HDFS: Secure Deletion in Hadoop Distributed File System



Abstract

Sensitive information stored in Hadoop clusters can potentially be retrieved without authorization. In addition, the ability to recover deleted data from Hadoop clusters represents a major security threat. Hadoop clusters are used to manage large amounts of data both within and outside of organizations, so it has become important to be able to locate and remove data effectively and efficiently. In this paper, we propose Secure Delete, a holistic framework that propagates file information to the block management layer via an auxiliary communication path. The framework tracks down undeleted data blocks and modifies the normal deletion operation in the Hadoop Distributed File System (HDFS). We introduce CheckerNode, which generates a summary report from all DataNodes and compares the block information with the metadata from the NameNode. If the metadata contain no entries for these data blocks, the unsynchronized blocks are automatically deleted. However, deleted data could still be recovered using digital forensics tools. We therefore also describe a novel secure deletion technique in HDFS that generates a random pattern and writes it multiple times to the disk location of the data block.
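The abstract describes two mechanisms: reconciling DataNode block reports against NameNode metadata to find orphaned blocks, and overwriting a block's disk location with random patterns before deletion. The sketch below illustrates both ideas in Python; the function names (`reconcile_blocks`, `secure_overwrite`) and the pass count are illustrative assumptions, not the paper's actual implementation.

```python
import os
import secrets


def reconcile_blocks(datanode_blocks, namenode_metadata):
    """Return block IDs reported by DataNodes that have no entry in the
    NameNode metadata -- the 'unsynchronized' blocks the CheckerNode
    would flag for deletion. (Illustrative sketch, not the paper's code.)"""
    return set(datanode_blocks) - set(namenode_metadata)


def secure_overwrite(path, passes=3):
    """Overwrite a block file in place with random bytes several times,
    syncing to disk after each pass, then unlink it. This mimics the
    multi-pass random-pattern overwrite the abstract describes; the
    number of passes here is an assumption."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # force the pattern onto the disk
    os.remove(path)
```

Note that overwriting in place defeats simple forensic recovery of file contents, though on copy-on-write filesystems or SSDs with wear leveling the old blocks may still survive at the physical layer.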
