In December, Motherboard discovered a Redditor named ‘deepfakes’ quietly enjoying his hobby: swapping celebrity faces onto porn performers’ bodies. He made several convincing porn videos of celebrities — including Gal Gadot, Maisie Williams, and Taylor Swift — using a machine learning algorithm, his home computer, publicly available videos, and some spare time. Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. Another Redditor even created an app specifically designed to let users without a computer science background create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied by instructions that walk novices through the process.
An incredibly easy-to-use application for DIY fake videos — of sex and revenge porn, but also political speeches and whatever else you want — that moves and improves at this pace could have society-changing impacts on the way we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.