Recoverable facial identity protection via adaptive makeup transfer adversarial attacks
Unauthorised face recognition (FR) systems pose significant threats to digital identity and privacy protection. To alleviate the risk of compromised identities, recent makeup transfer-based attack methods embed adversarial signals to confuse unauthorised FR systems. However, their major weakness is that they use a fixed image, unrelated to both the protected image and the makeup reference image, as the confusion identity, which degrades both the attack success rate and the visual quality of the transferred photos. In addition, the generated images cannot be recognised by authorised FR systems once attacks are triggered. To address these challenges, in this paper we propose a Recoverable Makeup Transferred Generative Adversarial Network (RMT-GAN), whose distinctive feature is that it improves image-transfer quality by selecting a suitable transfer reference photo as the target identity. Moreover, our method offers a solution to recover the protected photos to their original counterparts so that they can be recognised by authorised systems. Experimental results demonstrate that our method provides significantly improved attack success rates while maintaining higher visual quality compared with state-of-the-art makeup transfer-based adversarial attack methods.
History
School
- Science
Department
- Computer Science
Published in
Proceedings of the 39th AAAI Conference on Artificial Intelligence
Source
The 39th Annual AAAI Conference on Artificial Intelligence
Publisher
Association for the Advancement of Artificial Intelligence
Version
- AM (Accepted Manuscript)
Rights holder
© Association for the Advancement of Artificial Intelligence
Publisher statement
This is a conference paper presented at the 39th Annual AAAI Conference on Artificial Intelligence. It is due to be published openly © Association for the Advancement of Artificial Intelligence. All Rights Reserved.
Acceptance date
2024-12-10
Copyright date
2025
ISSN
2374-3468
eISSN
2159-5399
Publisher version
Language
- en