It was only a matter of time. Users on Deepfakes, a subreddit where people use deep learning technology to face swap celebrities’ likenesses onto footage of porn performers, are turning their skills to a new purpose: inserting the much-memed actor Nicolas Cage into roles he never played.
Back in December, Motherboard discovered a Reddit user who was using a neural network to swap the faces of celebrities including Gal Gadot and Taylor Swift onto the bodies of porn performers. Just weeks later, another redditor used the same algorithm to create a user-friendly program called FakeApp, which streamlines the process, letting users plug in a database of images to create new videos without much technical know-how about deep learning. In the wake of FakeApp, Motherboard reported, the phenomenon exploded: the Deepfakes subreddit, for instance, now has more than 30,000 followers.
And now, in what seems to be the natural progression of things on the internet, the subreddit is turning its attention to Cage, an actor known for leading roles in “Raising Arizona,” “Lord of War” and “Face/Off.” In spite of, or perhaps because of, Cage’s uneven critical reception, he has also become a timeless internet meme. Internet humorists have long remixed his strange performance in the 2006 remake of “The Wicker Man,” photoshopped him into surreal situations, and, most of all, plastered his face onto other people.
Now, on Reddit, users are using deepfakes to bring that trend to the moving picture. One clip swaps Cage’s face onto the body of Andy Samberg in a “Saturday Night Live” sketch, seated next to the real Cage. Another pastes his face onto Stannis Baratheon, a “Game of Thrones” character played by Stephen Dillane. Yet another superimposes him onto Sean Connery in “Dr. No.”
Another clip goes even more meta by swapping Cage into “Raiders of the Lost Ark”; on the subreddit, it was titled “Nic Cage // National Treasure 3 Edition,” a nod to a sequel in the Cage franchise that has not yet been made.
The Cage clips are fun, but they’re also a distraction from a technology with far-reaching implications. Gluing the likeness of a celebrity onto a smutty film is likely a preview of things to come. It could lead to a new era not just of faked porn but even doctored footage of public figures — and, probably, a new excuse when authentically damaging footage like Trump’s infamous Access Hollywood tape comes to light.
“To those who condemn the practices of this community, we sympathize with you,” wrote a redditor named Gravity_Horse in response to media attention on the subreddit. “What we do here isn't wholesome or honorable, it's derogatory, vulgar, and blindsiding to the women that deepfakes works on. That said, the work that we create here in this community is not with malicious intent. Quite the opposite. We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design.”
Just a day before the Cage clips started to appear, a commenter wondered how long it would be until someone created an entire Cage film with the face-swap technology.
“That's actually a very very good idea,” one user replied. “Safe for work and media outlets would love to write about this if done well.”