In today's world, pornography has taken on a new face. Porn websites are now able to feature TikTok stars like Addison Rae and Dixie D'Amelio — despite the fact that neither of them has ever made or distributed pornography. So, why do you see their faces in porn videos? That's thanks to the technology that has brought us deepfakes.
Essentially, deepfakes are videos in which someone appears to do or say something they never actually did or said. Deepfakes allow someone to watch Dixie or Addison or anyone else engage in sexual activity — simply because their heads were superimposed onto the bodies of people who actually did the engaging.
Dixie D'Amelio is just one of many victims of deepfakes.
Although the concept of deepfakes is not new, the use of TikTokers in deepfake pornography relatively is, and both Addison Rae and Dixie D'Amelio recently addressed dealing with it. Celebrities have been dealing with their images being used for pornography for quite some time, and the problem has only gotten worse as advancing technology has made such videos incredibly realistic.
According to deepfake detection company Sensity, up to 1,000 deepfake videos are uploaded to porn sites every month, with the number rising steadily throughout 2020. While there used to be dedicated deepfake porn sites, such videos have become so common and mainstream that they can now be found on major sites.
The second biggest porn website, Pornhub, even went so far as to ban deepfakes, and yet it still has a major problem with them. In its annual year-in-review, Pornhub even lets the public know which celebrities people are searching for. Although some celebrities who actually do share sexual content were on the list, far more were celebrities who were the victims of multiple deepfake videos.
In 2019, Ariana Grande was searched for 9 million times on Pornhub — searches that led directly to deepfake videos. Although Pornhub did technically ban deepfakes — removing channels associated with them and blocking the word itself as a search term — it's not very difficult for an uploader to get around that by using different keywords, such as the names of specific celebrities.
Essentially, Pornhub's year-in-review served as a reminder that the site isn't enforcing its own policy against deepfakes. This highlights an even greater issue that many have had with such porn sites in general — the lack of care taken with non-consensual videos.
When you're looking at a deepfake, you are looking at non-consensual pornography.
Even when we're discussing celebrities on Pornhub and similar sites, we're often talking about things like leaked sex tapes — more examples of non-consensual videos. Private videos being uploaded and distributed to the masses is not a new problem. Deepfakes have just complicated a problem that was already out of control.
It's often said that once something is online, it stays there forever. This is especially true of sensitive content. When someone tries to have a video removed from a porn site, even if they succeed, more often than not it will just pop back up again — if it hasn't already been uploaded multiple times. There are entire sections of porn sites dedicated to "secret recordings" and "Snapchat leaks."
When someone's image is distributed online without their permission, it is non-consensual. This includes deepfake videos. Those are the images of Addison Rae and Dixie D'Amelio, two TikTokers who were barely of age when they began their careers, doctored into something they never consented to do.
Although there is some legal action the young women can take, their options are extremely limited. One of the biggest issues is that deepfakes are unprecedented, and the U.S. legal system doesn't quite know how to deal with them yet. Although there are arguments to be made that such videos constitute copyright violation or defamation of character, both are difficult to prove in a court of law.
Moreover, the videos may be protected by the First Amendment, so it would fall to the plaintiff to prove that their claims outweigh those protections. Worse still, such lawsuits would have to be brought against the original producer of the video, not the host. Section 230 of the Communications Decency Act says that websites are not liable for third-party content, effectively making porn sites like xHamster and Pornhub not responsible for any deepfakes found on their platforms.
And finding those original producers is not simple. Even more problematic, if the producer is overseas, taking legal action becomes more complicated still.
Although we've been warned that deepfakes are going to affect politics, the reality is that 96 percent of the deepfakes discovered on the internet in 2018 were pornography using women's images without their consent. With the number of deepfakes increasing year after year, and with porn sites having very little in place to protect against non-consensual videos, this is likely a reality that Dixie D'Amelio, Addison Rae, and many other young stars will be forced to live with.