There are all kinds of reasons to worry about deepfake technology, which makes it possible to digitally manipulate faces to say or do just about anything. But that same face-swapping tech could be used for good as well, as a new showcase from researchers at the Norwegian University of Science and Technology makes clear.
They’ve developed a Generative Adversarial Network (GAN) called DeepPrivacy that’s designed for face anonymization. Essentially, it uses deepfake-style synthesis to generate a new face in real time that could keep your identity a secret online. The difference between this and, say, pixelating your face is that it preserves the user’s real expression while seamlessly blending the fake face into the background. In other words, users could still interact through video chats; they just wouldn’t look like themselves.
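For a sense of what the pixelation baseline throws away, here is a minimal sketch using OpenCV; the file path and bounding-box coordinates are placeholders, since in practice they would come from a face detector:

```python
import cv2

# Load a frame and define the detected face's bounding box.
# (The path and x, y, w, h values are placeholders; a real pipeline
# would get the box from a face detector.)
image = cv2.imread("frame.jpg")
x, y, w, h = 120, 80, 96, 96

# Classic anonymization: shrink the face region to a handful of pixels
# and scale it back up, destroying identity *and* expression together.
face = image[y:y + h, x:x + w]
small = cv2.resize(face, (8, 8), interpolation=cv2.INTER_LINEAR)
image[y:y + h, x:x + w] = cv2.resize(small, (w, h),
                                     interpolation=cv2.INTER_NEAREST)

cv2.imwrite("pixelated.jpg", image)
```

Pixelation discards the expression along with the identity; DeepPrivacy's approach instead fills the same region with an entirely synthesized face, so the expression and the blend with the surrounding background survive.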
“We propose a novel architecture which is able to automatically anonymize faces in images while retaining the original data distribution,” the researchers write in an abstract for their paper, which is published on the arXiv preprint server. “We ensure total anonymization of all faces in an image by generating images exclusively on privacy-safe information. Our model is based on a conditional generative adversarial network, generating images considering the original pose and image background. The conditional information enables us to generate highly realistic faces with a seamless transition between the generated face and the existing background.”
As far as the researchers know, no other solution has been proposed that guarantees the anonymization of faces while generating realistic images.
The technology works by first detecting a person’s face and extracting the positions of key facial features, which capture their pose and expression. A GAN that has been trained on a massive dataset of 1.47 million face images then synthesizes a wholly new face, dreamed up by the algorithm, to fit that pose. (Although it’s worth noting that the generated faces don’t extend to the ears or the outer parts of the head, which could still be used for identification.)
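The overall flow (mask out the face, condition a generator on the surrounding background plus the extracted keypoints, then composite the synthesized face back in) might look like the following PyTorch sketch. This is an illustrative skeleton under stated assumptions, not the researchers’ implementation: the toy generator, the keypoint-heatmap encoding, and all tensor shapes are assumptions made for the example.

```python
import torch
import torch.nn as nn


class ToyFaceGenerator(nn.Module):
    """Stand-in for a conditional generator: takes the masked image
    concatenated with keypoint heatmaps and predicts RGB pixels.
    (A hypothetical toy network, far smaller than a real GAN generator.)"""

    def __init__(self, num_keypoints: int = 7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3 + num_keypoints, 64, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1),
            nn.Sigmoid(),  # keep outputs in [0, 1]
        )

    def forward(self, masked_image, heatmaps):
        return self.net(torch.cat([masked_image, heatmaps], dim=1))


def anonymize(image, box, keypoint_heatmaps, generator):
    """Replace the face inside `box` with a generated one.

    image:             (1, 3, H, W) tensor in [0, 1]
    box:               (x0, y0, x1, y1) face bounding box
    keypoint_heatmaps: (1, K, H, W) tensor, one blob per facial keypoint
    """
    x0, y0, x1, y1 = box
    mask = torch.zeros_like(image[:, :1])  # 1 inside the face box
    mask[:, :, y0:y1, x0:x1] = 1.0

    # 1. Remove the privacy-sensitive pixels: the generator never sees
    #    the original face, only the background and the sparse pose.
    masked = image * (1.0 - mask)

    # 2. Synthesize a full image conditioned on background + keypoints.
    generated = generator(masked, keypoint_heatmaps)

    # 3. Composite: keep the real background, paste in the fake face.
    return generated * mask + image * (1.0 - mask)


# Toy usage with random data; a real pipeline would get the box and
# keypoints from face/pose detectors rather than random tensors.
img = torch.rand(1, 3, 128, 128)
heatmaps = torch.rand(1, 7, 128, 128)
out = anonymize(img, (40, 30, 90, 100), heatmaps, ToyFaceGenerator())
print(out.shape)  # torch.Size([1, 3, 128, 128])
```

The key design point the sketch tries to capture is step 1: because the face region is blanked out before the generator ever runs, the anonymized output can only be built from privacy-safe information, which is what lets the researchers claim guaranteed anonymization rather than mere obfuscation.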
While such a tool could be abused by bad actors as well as used for good, it could prove genuinely useful for whistleblowers and others who need to stay anonymous on camera. The next question is how to get it to those who need it.