TikTok star Addison Rae is reportedly the latest victim of AI deepfake porn, a disturbing new way that public figures and celebrities are being sexualized and exploited against their will.
A fake NSFW video that appears to show Addison, 22, engaging in explicit acts went viral this week after leaking from an unknown deepfake platform.
“This AI s–t is actually terrifying to witness. This isn’t Addison Rae, it’s a deepfake. Women are in danger of being sexually exploited with this new technology and it’s horrifying. We’re seeing a completely new way women’s safety is being targeted and there’s nothing to stop it,” one Twitter user wrote, quote-tweeting a since-deleted tweet that contained a clip from the fake video of Addison sitting on a bed in her underwear.
The original tweet containing the deepfake clip had racked up over 21 million views and, perhaps more disturbingly, over 17,000 bookmarks before it was deleted Friday (June 2).
“Well you’re weird [and] disgusting [and] clearly don’t care about consent, congrats on the hit tweet [though, I guess]!” one person tweeted in reply to the since-removed video clip.
“Is this not like …. Sexual harassment?” another person tweeted.
“Women cannot exist online anymore without u freaks doing something humiliating about it,” someone else wrote, perhaps summing it up best.
The deepfake video reportedly features an AI-generated likeness of Addison’s face superimposed onto the body of a real porn performer.
On Twitter, NBC tech and culture reporter Kat Tenbarge noted that Addison previously expressed she wouldn’t be comfortable opening an account on OnlyFans, a platform widely known for hosting explicit content.
“The viral Addison Rae deepfake is one of dozens I’ve seen. They’re incredibly disturbing. Back in 2020 on a podcast, Addison said she wouldn’t be comfortable having an OnlyFans, but men on the internet did it anyways without her consent. It’s a violation,” Tenbarge wrote.
“Major 2020 influencers like Addison are some of the most popular women to be deepfaked, including ones who were underage girls when they became famous. That is the most popular kind of deepfake for young actresses/influencers — ones who are ‘barely legal,’” she continued in another tweet.
Tenbarge also wrote that such deepfake videos violate the rights of “whichever sex worker(s) created the original videos.”
“The way these videos are made is by taking their original work and swapping the face to create a non-consensual product involving both women, which the deepfake creator can profit from,” she explained.