Computer Science & Technology

Image Tampering Localization Based on Visual Multi-Scale Transformer

  • LU Lu,
  • ZHONG Wen-Yu,
  • WU Xiao-Kun
  • School of Computer Science and Engineering, South China University of Technology, Guangzhou 510640, Guangdong, China
LU Lu (b. 1971), male, professor; his research interests include computer vision and software quality assurance.

Received date: 2021-09-17

Revised date: 2021-10-27

Online published: 2021-11-08

Supported by

the National Social Science Foundation Key Project of China; the Major Program of the Zhongshan Industry-Academia-Research Fund

Abstract

With the continuous development of digital image processing technology, image tampering is no longer limited to a single method such as splicing. Post-processing in image editing software hides the traces of malicious tampering, which degrades the performance of both traditional image forgery detection algorithms and deep-learning-based tampering localization methods. To address the low accuracy of existing image tampering localization algorithms, an end-to-end image tampering localization network based on a multi-scale visual Transformer is proposed. The network combines a Transformer with a convolutional encoder to extract the feature differences between tampered and untampered regions. The multi-scale Transformer models the spatial information of image patch sequences of different sizes, so that the network can adapt to tampered regions of various shapes and sizes. Experimental results show that the proposed algorithm achieves F1 and AUC scores of 0.431 and 0.877 on the CASIA test set and 0.728 and 0.971 on the NIST2016 test set, a significant improvement over existing mainstream algorithms. Moreover, the proposed algorithm is robust to JPEG compression attacks.
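To make the described architecture concrete, the following is a minimal sketch (assumed PyTorch; layer widths, depths, patch sizes, and the fusion strategy are illustrative choices, not the authors' exact design) of how a convolutional encoder can be combined with Transformer branches operating on patch sequences of different sizes to produce a per-pixel tampering mask.

```python
# Illustrative sketch only: a CNN encoder followed by multi-scale
# Transformer branches and a per-pixel localization head. All module
# names and hyperparameters are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleTamperLocalizer(nn.Module):
    def __init__(self, dim=256, patch_sizes=(4, 8), depth=2, heads=8):
        super().__init__()
        # Convolutional encoder: extracts local features whose statistics
        # differ between tampered and pristine regions.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # One Transformer branch per patch size: small patches capture fine
        # tampering boundaries, large patches capture region-level context.
        self.embeds = nn.ModuleList(
            nn.Conv2d(dim, dim, kernel_size=p, stride=p) for p in patch_sizes
        )
        self.transformers = nn.ModuleList(
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True),
                num_layers=depth)
            for _ in patch_sizes
        )
        # Fuse the multi-scale branches and predict a single-channel mask.
        self.head = nn.Conv2d(dim * len(patch_sizes), 1, kernel_size=1)

    def forward(self, x):
        feat = self.cnn(x)                            # (B, dim, H/4, W/4)
        outs = []
        for embed, tr in zip(self.embeds, self.transformers):
            tokens = embed(feat)                      # (B, dim, h, w)
            b, c, h, w = tokens.shape
            seq = tokens.flatten(2).transpose(1, 2)   # (B, h*w, dim)
            seq = tr(seq)                             # spatial self-attention
            tokens = seq.transpose(1, 2).reshape(b, c, h, w)
            # Upsample each branch back to the CNN feature resolution.
            outs.append(F.interpolate(tokens, size=feat.shape[-2:],
                                      mode="bilinear", align_corners=False))
        fused = torch.cat(outs, dim=1)
        mask = self.head(fused)                       # tampering logits
        return F.interpolate(mask, size=x.shape[-2:],
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = MultiScaleTamperLocalizer()
    logits = model(torch.randn(1, 3, 256, 256))
    print(logits.shape)  # torch.Size([1, 1, 256, 256])
```

In this sketch the per-pixel logits would be trained against a binary tampering mask (e.g., with a binary cross-entropy loss); the choice of two patch sizes is only an example of the multi-scale idea described in the abstract.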

Cite this article

LU Lu, ZHONG Wen-Yu, WU Xiao-Kun. Image Tampering Localization Based on Visual Multi-Scale Transformer[J]. Journal of South China University of Technology (Natural Science), 2022, 50(6): 10-18. DOI: 10.12141/j.issn.1000-565X.210603
