Singularity Size Optimization in Data Deduplication Technique

Abstract

Recently, massive data growth and data duplication in enterprise systems have led to the adoption of deduplication techniques. Since we keep multiple versions of files, a large volume of mostly or exactly identical data may accumulate. Deduplication is a powerful storage optimization technique that can be adopted to manage the maintenance issues caused by data growth. We evaluated how the singularity size affects the deduplication effect for variable-length blocks, and we clarify that this effect depends on both the singularity size and the number of blocks created. We traced the change in the deduplication rate, which indicates the reduction ratio of the file data volume, while varying the singularity size from 4 bits to 23 bits. The results show that the optimum singularity size is 15 bits and that the deduplication rate at this optimum improves by around 7% compared with smaller or larger singularity sizes.
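
The abstract does not spell out the chunking algorithm, but in variable-length (content-defined) deduplication the singularity size is typically the number of low-order rolling-hash bits that must equal a fixed pattern for a position to become a block boundary; a singularity size of s bits then yields an expected block length of about 2^s bytes, which is why both the block count and the deduplication rate move with s. A minimal sketch of this interpretation, assuming a simple 32-bit polynomial rolling hash, a 48-byte window, and SHA-256 block fingerprints (all illustrative choices, not taken from the paper):

```python
import hashlib

# Assumed parameters for illustration; only the 15-bit optimum comes from the paper.
SINGULARITY_BITS = 15                  # singularity size: low-order hash bits that must be zero
MASK = (1 << SINGULARITY_BITS) - 1
WINDOW = 48                            # rolling-hash window in bytes (illustrative)
BASE = 257                             # polynomial base (illustrative)
POW_OUT = pow(BASE, WINDOW, 1 << 32)   # coefficient of the byte leaving the window

def chunk(data: bytes) -> list[bytes]:
    """Split data into variable-length blocks. A boundary is declared when the
    low SINGULARITY_BITS bits of the rolling hash are all zero, so the expected
    block length is roughly 2**SINGULARITY_BITS bytes."""
    blocks, start, h = [], 0, 0
    for i, b in enumerate(data):
        pos = i - start                                        # offset within the current block
        h = (h * BASE + b) & 0xFFFFFFFF                        # slide the new byte in
        if pos >= WINDOW:
            h = (h - data[i - WINDOW] * POW_OUT) & 0xFFFFFFFF  # slide the old byte out
        if pos + 1 >= WINDOW and (h & MASK) == 0:
            blocks.append(data[start:i + 1])
            start, h = i + 1, 0                                # restart the hash for the next block
    if start < len(data):
        blocks.append(data[start:])                            # trailing block with no boundary
    return blocks

def dedup_rate(files: list[bytes]) -> float:
    """Deduplication rate = 1 - (bytes actually stored / total bytes), storing
    each distinct block once, keyed by its SHA-256 fingerprint."""
    seen, total, stored = set(), 0, 0
    for data in files:
        for blk in chunk(data):
            total += len(blk)
            fp = hashlib.sha256(blk).digest()
            if fp not in seen:
                seen.add(fp)
                stored += len(blk)
    return 1.0 - stored / total if total else 0.0
```

Sweeping SINGULARITY_BITS from 4 to 23 over a corpus of file versions and recording dedup_rate mirrors the experiment described above: very small values fragment the data into many short blocks, while very large values produce few long blocks that rarely repeat, so the rate peaks at an intermediate singularity size.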