Creators of Explicit Taylor Swift AI Images Made the Pictures as Part of a ‘Game’ and ‘Challenge’ Before They Leaked

Fake AI-generated images sexualizing Taylor Swift that went viral on social media last month have been attributed to users of the 4chan message board, who were competing with one another to exploit popular AI image generators.

Explicit deepfakes of Taylor Swift flooded X on Jan. 24, garnering more than 27 million views and 260,000 likes before the account that posted them was suspended, 19 hours after the images first went up, NBC News reported.

According to the New York Times, researchers from Graphika, a firm that analyzes online manipulation tactics, have linked the pornographic Taylor Swift images to 4chan, an anonymous, adults-only message board.

Graphika noted that participants on the message board take part in daily challenges, exchanging strategies to evade the filters of AI image generators.

“Some 4chan users expressed a stated goal of trying to defeat mainstream AI image generators’ safeguards rather than creating realistic sexual content with alternative open-source image generators,” Graphika reported, according to Ars Technica.

“They also shared multiple behavioral techniques to create image prompts, attempt to avoid bans, and successfully create sexually explicit celebrity images.”

The Times said users were instructed to use Microsoft tools such as Bing Image Creator and Microsoft Designer, along with OpenAI’s DALL-E.
