Deepfake Images Of Taylor Swift Taken Down
Sexually explicit AI-generated images of Taylor Swift have been circulating on X (Twitter) recently, the latest example of the proliferation of fake pornography.
The spread of AI-generated sexually explicit images of the internationally popular singer-songwriter Taylor Swift is bad enough, and she is perhaps the most well-known example of what is happening to women every day.
Now, US politicians are calling for new laws to criminalise the creation of deepfake images and videos, after the Taylor Swift images were posted on social media sites, including X and Telegram, attracting more than 45m views.
In a statement, X (Twitter) said it was "actively removing" the images and taking "appropriate actions" against the accounts involved in spreading them. It added: "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed." While many of the images appear to have been removed at the time of publication, one photo of Swift was reported to have been viewed 47 million times before being taken down.
The name "Taylor Swift" is no longer searchable on X, alongside terms such as "Taylor Swift AI" and "Taylor AI"
Deepfakes use Artificial Intelligence (AI) to make a video of someone by manipulating their face or body. A 2023 study found a 550% rise in the creation of doctored images since 2019, fuelled by the emergence of AI. The exploitation of generative AI tools to create potentially harmful content targeting all types of public figures is increasing quickly and spreading faster than ever across social media.
In the US there are no federal laws against the sharing or creation of deepfake images, though there have been moves at state level to tackle the issue. In the UK, the sharing of deepfake pornography became illegal under the Online Safety Act in 2023.
US congressman Joe Morelle described the fake pictures as “appalling”. “The spread of AI-generated explicit images of Taylor Swift is appalling and sadly, it’s happening to women everywhere, every day,” he said on X.
“It’s sexual exploitation, and I’m fighting to make it a federal crime with my legislation: the Preventing Deepfakes of Intimate Images Act.”
Today, 404Media is reporting that Microsoft has closed a loophole in its AI tool, Designer, which allowed users to generate nude AI images of celebrities. Microsoft's changes come after 404Media first reported that the Swift images that went viral on Twitter originated on 4chan and Telegram, where people used Designer to make nude images of Swift and other celebrities.
This will not stop the flow of AI-generated deepfake images on the internet, but it is a very positive development in fighting back against this form of abuse.
Sources: 404Media | Joe Morelle | ITV | BBC | Herald Scotland | WHTimes | MSN | The Verge | Vogue