Fake pornographic images of Taylor Swift generated using artificial intelligence are circulating on social media, leaving her loyal legion of Swifties wondering how there’s not more regulation around the nonconsensual creation of X-rated images.
The images in question — known as “deepfakes” — show Swift in various sexualized positions at a Kansas City Chiefs game, a nod to her highly publicized romance with the team’s tight end Travis Kelce.
It wasn’t immediately clear who created the images or first shared them to X, though as of Thursday morning, “Taylor Swift AI” was trending on the platform, with more than 58,000 posts on the topic.
Swifties came together and tried to bury the images by sharing an influx of positive posts about the 34-year-old songstress.
“How is this not considered sexual assault??” one X user asked. “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable how are there no regulations laws preventing this.”
“When i saw the taylor swift AI pictures, i couldn’t believe my eyes. Those AI pictures are disgusting,” another said.
Other outraged Swift fans called whoever created them “disgusting” and said instances like these “ruin the [AI] technology.”
“Whosoever released them deserves punishment,” yet another chimed in.
Swift’s publicist, Tree Paine, did not immediately respond to The Post’s request for comment.
President Biden signed an executive order to further regulate AI in October that prevents “generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals,” among other things, including further oversight of the tech’s use in developing biological materials.
The order also demands that the federal government issue guidance “to watermark or otherwise label output from generative AI.”
Nonconsensual deepfake pornography has also been made illegal in Texas, Minnesota, New York, Hawaii and Georgia, though that hasn’t stopped the circulation of AI-generated nude images at high schools in New Jersey and Florida, where explicit deepfake images of female students were circulated by male classmates.
Last week, US Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ) reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime, with penalties including jail time, a fine or both.
The “Preventing Deepfakes of Intimate Images Act” was referred to the House Committee on the Judiciary, which has yet to act on it.
Aside from making the sharing of digitally altered intimate images a criminal offense, Morelle and Kean’s proposed legislation would allow victims to sue offenders in civil court.
In an example of how convincing this technology can be, several Swift fans were reportedly scammed out of hundreds of dollars earlier this month after tricksters released advertisements employing AI-generated video of the Grammy winner peddling Le Creuset in an attempt to steal money and data from fans.
The ads — which circulated across social media platforms — show Swift standing next to a Le Creuset Dutch oven, which, according to the official website, runs anywhere from $180 to $750 depending on size and style.
Last year, other deepfake images of Pope Francis in a Balenciaga puffer jacket and Donald Trump resisting arrest also took the internet by storm.