- One of the most viral explicit posts depicting Swift received more than 45 million views, The Verge reported, before X removed it. The images likely originated in a Telegram channel that produces similar images, according to 404 Media.
- In an effort to drown out searches for the explicit images on the social media platforms, Swift’s fans posted widely with the phrase, “Protect Taylor Swift.”
- Representatives for Swift and X, formerly known as Twitter, did not immediately respond to The Washington Post’s requests for comment Friday morning.
Deepfakes are lifelike fake videos or images created with face- and audio-swapping technology. They often go viral on social platforms, and the technology has grown better at replicating a person’s voice. Meanwhile, tools for identifying AI-made images have struggled to keep up, making it harder for those platforms to spot problematic videos.
Celebrities have warned followers not to be duped by the deepfakes.
Easy access to AI imaging technology has created a new tool to target women, allowing almost anyone to produce and share nude images of them.
Ahead of the 2024 presidential election, AI is also giving politicians excuses to dismiss potentially damaging pieces of evidence as fakes generated by AI. That’s happening at the same time as real AI deepfakes are being used to spread misinformation.
What are governments doing about AI and deepfakes?
States are trying to lead the way on guardrails against AI, with some enacting measures to protect against the use of deepfakes in elections.
In Congress, Rep. Joseph Morelle (D-N.Y.) has introduced a bill called the Preventing Deepfakes of Intimate Images Act, which he says would make creating those types of videos a federal crime.
“The spread of AI-generated explicit images of Taylor Swift is appalling—and sadly, it’s happening to women everywhere, every day,” Morelle wrote Thursday on X.
The images of Swift spread widely on X, which dismantled much of its moderation shortly after Elon Musk took over the platform.
In a statement released overnight, X said it had removed the Swift images, though the statement did not mention her by name.
“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” the statement said. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We’re committed to maintaining a safe and respectful environment for all users.”
In response to The Washington Post’s email requesting comment Friday morning, an automated message said: “Busy now, please check back later.”
Samantha Chery contributed to this report.