We are reaching a point where it is almost impossible to trust what our eyes see on the internet, whether because of the infamous deepfakes or because of the manipulation of images and videos. That is why Adobe is showing the first glimpse of a tool based on artificial intelligence that, they say, can determine when a face has been manipulated with Photoshop, with the goal of “restoring confidence in digital media.”
Adobe has been experimenting with new artificial intelligence tools for several years now. In fact, last year they said they were developing software to detect images manipulated with Photoshop. Today we are seeing the first result of that effort, for the moment focused on faces.
Not only will it detect when an image has been manipulated, it will also revert it to its original state
According to the announcement, a team of researchers from Adobe and the University of California, Berkeley managed to train a convolutional neural network (CNN) on a series of images showing faces before and after being digitally manipulated.
With this, the neural network was able to detect changes in images manipulated with Photoshop’s ‘Face Aware Liquify’ feature, which makes it relatively simple to alter the eyes, mouth, and other facial features of the people in a photo.
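The training setup described above amounts to a binary classification problem: each before/after pair yields one untouched example and one manipulated example. A minimal sketch of that data preparation step, purely illustrative (the file layout and labels are assumptions, not Adobe's actual pipeline):

```python
# Hypothetical sketch: turning before/after face pairs into a labeled
# dataset for an "edited vs. original" classifier. Paths and naming
# are invented for illustration; this is not Adobe's real pipeline.
from pathlib import Path

def build_dataset(pairs):
    """Map (original, edited) image-path pairs to labeled examples:
    label 0 = untouched face, label 1 = manipulated face."""
    examples = []
    for original, edited in pairs:
        examples.append((original, 0))  # real face
        examples.append((edited, 1))    # Liquify-edited face
    return examples

pairs = [(Path("faces/001_before.jpg"), Path("faces/001_after.jpg"))]
dataset = build_dataset(pairs)
print(dataset)
```

In the real system such labeled examples would then be fed to the CNN, which learns to spot the subtle warping artifacts that Liquify leaves behind.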
With the model trained, the researchers pitted its detection capabilities against those of humans: a group of people was shown two images at a time and asked to identify which of the two had been edited.
In these tests, humans guessed correctly only 53% of the time, while the neural network detected the manipulated image with 99% accuracy. The neural network also pointed out the areas where the edits had been made and could even revert the image to its original state.
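The test described above is a two-alternative forced choice: for each pair, the judge (human or model) picks which image they believe was edited, and accuracy is simply the fraction of correct picks. A small sketch of that scoring, with invented example data:

```python
# Hypothetical scoring of the two-alternative test described above.
# The picks and answers below are illustrative, not the study's data.
def forced_choice_accuracy(picks, answers):
    """picks/answers are sequences of 0 or 1 indicating which image
    in each pair was chosen / was actually the edited one."""
    correct = sum(p == a for p, a in zip(picks, answers))
    return correct / len(answers)

answers = [0, 1, 1, 0]       # which image in each pair was edited
human_picks = [0, 0, 1, 1]   # made-up guesses for illustration
print(forced_choice_accuracy(human_picks, answers))  # → 0.5
```

Under this scoring, random guessing lands near 50%, which is why the humans' 53% means they could barely tell the edited faces apart, while the network's 99% is close to perfect.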
According to the researchers, the results are extremely promising, but the tool is still at an early stage of development. There is a long way to go, since the idea is for it to work not only with images edited in Photoshop but also with those manipulated using other tools or software.
Richard Zhang, an Adobe researcher, mentioned:
“This is just the beginning, it is a job that is urgently needed because we live in a world where it is increasingly difficult to trust the digital information we consume.”