Nonconsensual sexually explicit deepfakes of Taylor Swift went viral on X on Wednesday, amassing over 27 million views and more than 260,000 likes in 19 hours before the account that posted the images was suspended.
Deepfakes portraying Swift nude and in sexual scenarios continue to proliferate on X, including reposts of the viral deepfake images. Such images can be generated with AI tools that create entirely new, fake images, or by taking a real photo and “undressing” it with AI.
The origin of the images isn’t clear, but a watermark on them indicates that they came from a years-old website known for publishing fake nude images of celebrities. The website has a section titled “AI deepfake.”
Reality Defender, an AI-detection software company, scanned the images and said that there was a high likelihood that they were created with AI technology.
The mass proliferation of the images for nearly a day shines a spotlight on the increasingly alarming spread of AI-generated content and misinformation online. Despite the escalation of the issue in recent months, tech platforms like X, which have developed their own generative-AI products, have yet to deploy or discuss tools to detect generative-AI content that goes against their guidelines.