Dozens of websites and platforms have banned a new category of pornography that replaces the face of the real actor or actress with someone else's. This type of pornography is known as deepfakes, and it typically swaps in the faces of actors, pop singers, or other well-known personalities.
Twitter is the latest platform to ban this type of pornography. Twitter allows pornography on its network, but the company has said that deepfakes break its rules, since they depict people in intimate situations without their consent.
The adult website PornHub was another major platform to publicly ban deepfake videos. Until recently, deepfake content had been an important source of traffic for the site, but by siding with Twitter, both companies seem intent on making clear that fake pornography has serious consequences.
Twitter was quick to take action. It took the company only six hours to detect deepfakes being publicly shared on Reddit forums. The company told Motherboard that it “will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent”.
How deepfake videos are created
Deepfakes are created using a program called FakeApp, which gives users artificial intelligence tools and a simplified workflow for swapping faces in a video file. The main issue is that a user can, in fact, set an AI to generate non-consensual pornography using the faces of celebrities, or of just about anyone for that matter.
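For context on how tools like this work under the hood: face-swapping systems of this kind are generally built around an autoencoder with one shared encoder and a separate decoder per identity, and the swap happens when one person's encoded face is decoded with the other person's decoder. The snippet below is a minimal, hypothetical PyTorch sketch of that general idea, with toy layer sizes and a random tensor standing in for a real face crop; it is illustrative only and is not FakeApp's actual code.

```python
# Hypothetical sketch of the shared-encoder / per-identity-decoder idea
# behind deepfake-style face swapping. Illustrative only; not FakeApp's code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 RGB face from the latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

# One shared encoder learns general facial structure; each identity gets its
# own decoder that learns to render that identity's appearance.
encoder = Encoder()
decoder_a = Decoder()   # would be trained on faces of person A
decoder_b = Decoder()   # would be trained on faces of person B

# Training (omitted here) minimizes a reconstruction loss per identity, e.g.
# mse(decoder_a(encoder(face_a)), face_a), and likewise for person B.

# The "swap": encode a frame of person A, decode it with B's decoder, so the
# output keeps A's pose and expression but B's appearance.
face_a = torch.rand(1, 3, 64, 64)           # stand-in for a detected face crop
swapped = decoder_b(encoder(face_a))
print(swapped.shape)                        # torch.Size([1, 3, 64, 64])
```

In a real pipeline, faces would first be detected and aligned in each video frame, the two decoders would be trained on large sets of face crops, and the swapped faces would then be blended back into the original frames.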
The origin can be traced back to a Reddit user who went by the name “deepfakes” and started out by editing Gal Gadot’s face onto a porn performer’s body. Motherboard first reported on him, and he soon created a subreddit that gained subscribers remarkably fast.
Another user followed by creating a “user-friendly system” that used AI to let just about anyone, even without technical knowledge, create fake porn by swapping in celebrities’ faces. This was the birth of FakeApp. Soon, these videos spread all over porn sites, until most of them were banned by the platforms.
3 of the creepiest things about 'deepfake' video https://t.co/7ocZPfyhqt by @fionajmcevoy
— VentureBeat (@VentureBeat) February 8, 2018
Young female celebrities are the most popular deepfake targets
Most of the women reported to appear in pornographic deepfake videos are popular young TV and film stars. Emma Watson, Jennifer Lawrence, Sophie Turner, and Ariana Grande have been primary targets.
Not all deepfake videos have been related to porn, though. One of the most popular videos featured U.S. President Donald Trump as Dr. Evil, and another put Nicolas Cage’s face onto Yoda. These lighthearted comedic uses seem to be well received, but the tools used to make them have nonetheless drawn outrage from platforms and from some of the celebrities involved.
Source: The Verge