International Journal of Image and Graphics

Image Copy-Move Forgery Detection Using Combination of Scale-Invariant Feature Transform and Local Binary Pattern Features



Abstract

Today, manipulating, storing, and sharing digital images is simple and easy because of advances in digital imaging hardware and software. Digital images appear in many contexts of people's lives, such as news and forensics, so the reliability of a received image is a question that often occupies the viewer's mind, and the authenticity of digital images is increasingly important. Classifying a forged image as genuine, or a genuine image as forged, can sometimes have irreparable consequences; for example, an image taken from a crime scene can lead to a wrong decision if it is judged incorrectly. In this paper, we propose a combined method based on texture attributes to improve the accuracy of copy-move forgery detection (CMFD) while reducing the false positive rate (FPR). The proposed method combines the scale-invariant feature transform (SIFT) with the local binary pattern (LBP). Considering the texture around the keypoints detected by the SIFT algorithm helps reduce incorrect matches and improves the accuracy of CMFD. In addition, several pre-processing steps are proposed to find more and better keypoints. The method was evaluated on the COVERAGE, GRIP, and MICC-F220 databases. Experimental results show that, without clustering or segmentation and with only simple matching operations, the proposed method achieves true positive rates of 98.75%, 95.45%, and 87% on the GRIP, MICC-F220, and COVERAGE datasets, respectively. It also achieves the best results on the GRIP dataset, with FPRs ranging from 17.75% down to 3.75%.
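The abstract describes the pipeline only at a high level. The Python sketch below illustrates one plausible reading of it: SIFT keypoints matched against themselves with Lowe's ratio test, followed by an LBP texture check around each matched pair. The patch size, LBP parameters, thresholds, and helper names (lbp_histogram, detect_copy_move) are illustrative assumptions, not the authors' exact settings.

    # Minimal sketch of SIFT + LBP copy-move matching, assuming OpenCV and
    # scikit-image are installed. All parameter values are illustrative.
    import cv2
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray, x, y, patch=16, points=8, radius=1):
        """Uniform-LBP histogram of a square patch centred on (x, y)."""
        h, w = gray.shape
        x0, y0 = int(max(x - patch, 0)), int(max(y - patch, 0))
        x1, y1 = int(min(x + patch, w)), int(min(y + patch, h))
        codes = local_binary_pattern(gray[y0:y1, x0:x1], points, radius, "uniform")
        hist, _ = np.histogram(codes, bins=points + 2, range=(0, points + 2), density=True)
        return hist

    def detect_copy_move(image_path, ratio=0.6, min_dist=40, lbp_thresh=0.25):
        gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
        sift = cv2.SIFT_create()
        keypoints, descriptors = sift.detectAndCompute(gray, None)

        # Match SIFT descriptors against themselves; k=3 so that, after dropping
        # the trivial self-match, two neighbours remain for Lowe's ratio test.
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(descriptors, descriptors, k=3)

        suspects = []
        for m in matches:
            cand = [d for d in m if d.trainIdx != d.queryIdx]
            if len(cand) < 2 or cand[0].distance >= ratio * cand[1].distance:
                continue
            p1 = keypoints[cand[0].queryIdx].pt
            p2 = keypoints[cand[0].trainIdx].pt
            if np.hypot(p1[0] - p2[0], p1[1] - p2[1]) < min_dist:
                continue  # too close: likely the same region, not a copied one
            # Texture check: keep the pair only if the LBP histograms around the
            # two keypoints agree, filtering out accidental SIFT matches.
            h1 = lbp_histogram(gray, *p1)
            h2 = lbp_histogram(gray, *p2)
            if np.abs(h1 - h2).sum() < lbp_thresh:
                suspects.append((p1, p2))
        return suspects  # a non-empty list of matched pairs suggests copy-move forgery

In this reading, the ratio test keeps only distinctive descriptor matches, the minimum-distance check discards near-self matches, and the LBP comparison plays the role the abstract assigns to texture features: rejecting keypoint pairs whose local texture disagrees, which is what lowers the false positive rate.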

