Social media company X restored users’ ability to search for Taylor Swift’s name on Monday evening, after temporarily blocking such searches over the weekend when the site was flooded with AI-generated explicit images of the pop star, raising questions about content moderation on the platform.
The search for the term “Taylor Swift” now yields tweets featuring the pop megastar’s name, instead of the “Something went wrong” error message users were presented with over the weekend.
The platform’s decision to block searches for Swift’s name came after AI-generated pornographic images of the singer began spreading on X, drawing anger from her fans.
X told various news outlets that searches for Swift’s name have been re-enabled, although the company “will continue to be vigilant for any attempt to spread this content and will remove it if we find it.”
Several dozen AI-generated images of Swift that were either sexually explicit or depicted violence against the pop star were shared widely by many accounts last week, with some receiving more than 45 million views and hundreds of thousands of likes. The images caused the phrase “Taylor Swift AI” to trend on the platform at one point last week. Legions of the singer’s fans, who call themselves “Swifties,” tried to fight back by mass reporting many of these posts and sharing more positive images of Swift with the hashtag #ProtectTaylorSwift. X initially responded to the controversy by saying the sharing of non-consensual nudity was “strictly prohibited,” and that it was actively working to take down the images and “taking appropriate actions against the accounts responsible for posting them.” A day later, the company suspended searches for the singer’s name outright.
A report by 404 Media traced the likely origin of the explicit images to Telegram groups, at least one of which used Microsoft Designer, a text-to-image AI generator aimed at graphic designers. Microsoft’s CEO told NBC News the images were “alarming and terrible,” and said the company was working to place guardrails around generative AI technology to prevent abuse. The company later said it had fixed a loophole that allowed Designer to be used to generate such images.
The incident comes amid a legislative push by some U.S. lawmakers to criminalize the sharing of non-consensual AI-generated pornography. Last week, White House press secretary Karine Jean-Pierre called the spread of these images “alarming” and urged Congress to “take legislative action.” Jean-Pierre also said social media companies had to enforce their own rules, adding: “We know that lax enforcement disproportionately impacts women and also girls…who are the overwhelming targets of online harassment and also abuse.”
Taylor Swift Search Re-enabled on X Following AI Nude (The Hollywood Reporter)
Deepfake explicit images of Taylor Swift spread on social media. Her fans are fighting back (Associated Press)
AI-Generated Taylor Swift Porn Went Viral on Twitter. Here’s How It Got There (404 Media)
Consent Is Core To Porn. Performers Say AI Threatens To Take It Away. (Forbes)