What Taylor Swift’s Deepfake Porn Nightmare Says About the Growing Threat of AI

Photo: Taylor Swift in front of a pixelated screen. Getty Images

And how she could combat it.

Picture this scenario: We’re strangers who’ve just met for a job interview, a blind date, or some other situation where first impressions matter, and you can’t get one worry out of your head: Was I one of the thousands of people who’d seen you having sex with a stranger? Because on video sites popping up all over the internet, your face, your expressions, your likeness are all superimposed on another woman’s body. It doesn’t matter that, technically, the woman in the video isn’t you. It doesn’t matter that you never taped the video. She sure looks like you — she shares your face, your voice, and, to an untrained eye, your body. With advances in AI, our human features — the ones that make us unique — can now be mimicked by an algorithm, and the scenario I’ve laid out is a nightmare becoming a reality for women and girls around the world.

I’ve been covering the human impact of technology for 15 years, and I can’t stop thinking about deepfake pornography and the threat it poses to all of us. Most of the time, when people hear “deepfake pornography,” their eyes glaze over. (I learned the awkward way that deepfake pornography isn’t dinner party small talk.) Honestly, I understand why: The concept seems far off. AI-generated sexually explicit images aren’t exactly top of mind for most of us. Sure, we’re aware of the threats of artificial intelligence, but we’re more concerned with the issues we face every day, like AI scams and robocalls.

But the threat hit a little closer to home when it happened to one of the most famous and worshiped women in the world: Taylor Swift. In recent weeks, sexually explicit deepfake photos of Swift began circulating on X (formerly Twitter). They were viewed millions upon millions of times before X took action and temporarily blocked searches for the superstar — even those unrelated to the porn.

It’s not just Taylor Swift 

Whether or not you’re a Swiftie (full disclosure: I am), unless you’ve been living in an unprecedented media blackout, you’re familiar with her impact. Swift’s power lies in her ability to create songs and narratives that are specific yet universal. We see ourselves in her lyrics.

Like a lot of what Swift creates, this debacle is a reflection of what many other women are experiencing or might experience in the future. What happened to Swift, while incredibly personal, represents a threat all women and girls face in the AI race to innovate, crystallizing a hard truth: This threat is coming to your community.

It’s already happening in high schools around the world. Recently, a New Jersey teen spoke out, calling for legislation after she and her classmates were the victims of sexually explicit deepfakes created without their consent.


I’ve been investigating this issue for the last year, and unfortunately, it’s even worse than you think. Open forums on the internet now let people share tips, code, and apps that enable this behavior. I recently spoke with a security researcher, TrustedSec’s David Kennedy, who told me about forums where users exchange notes on how to create personalized sexually explicit deepfake videos of their colleagues by recording company Zoom calls. There are sites like Mr. Deepfakes — a deepfake porn platform that millions of people visit every month — where users are encouraged to request and create personalized deepfake pornography.

When I first started covering tech, during the mobile revolution, entrepreneurs were building apps for just about everything. Need a ride? There’s an app for that. Need groceries? We’ve got you covered. But in the race to innovate in the AI revolution, it’s getting easier to perpetrate this type of abuse. We’re seeing the democratization of AI image-generating apps and a culture of “your wish is AI’s command.” Now there’s an app that can turn your likeness into deepfake pornography in a couple of clicks. Want to digitally undress someone? There’s an app for that. Did your crush reject you? No big deal: In less than a minute you can create a personalized sexually explicit deepfake video of her.

And now, with the rise of these apps that gamify and incentivize sexually explicit deepfakes, we are standing at the edge of a very dangerous cliff. If a young boy who’s been rejected can create a personalized sexually explicit image of his crush without her permission, what message does that send? Without the right guardrails and public discourse on AI image-generating products, we’re telling a generation of young men that it’s OK to dehumanize women and strip them of their consent.

Michael Matias, an entrepreneur building deepfake detection technology, has an unsettling warning: “This is going to happen to everybody. The simple fact is that anybody who has some imagination and agency — even a 15-year-old at home just trying out some apps — will be able to create these types of media using our kids, friends, and other family members.”


It’s fake porn, but real damage 

You may say: But Laurie, those images aren’t actually real, and everyone knows that. That’s not true: With advances in AI, it’s hard to distinguish what’s real from what’s not. Truth and the perception of truth aren’t so far apart. Even if viewers know it’s fake, having people see such a vulnerable version of yourself creates shame, humiliation, and reputational damage that is very real. And if I know anything about technology, it’s this: Abuse that happens online follows us into the real world. The behavior we incentivize online will inevitably affect us offline.

This shouldn’t surprise us. Anytime we see tech innovation, we’re exposed to new unintended consequences. Sexually explicit deepfakes represent the beginning of a new reality in which we can no longer trust what we see and hear, but the implications of this move beyond women and girls into democracy, misinformation, and public safety. Think of deepfake pornography as a testing ground for bad actors online and across the world, Matias suggested.

“Now we see a massive and fast adoption of deepfakes transitioning from pornography to elections, to the day-to-day world, and a faster progression than what we’ve seen before,” Matias explained. What once seemed like a fringe threat now has implications none of us can look away from.  

A little over a week after sexually explicit deepfakes of Swift went viral, new deepfake images emerged on X showing Swift supporting Donald Trump and pushing election conspiracy theories. Those images also went viral before the company took action.

What needs to happen next

The quicker tech companies can implement technology to label, detect, and stop the spread of sexually explicit deepfakes, the better off all of us will be. In the last week, a spokesperson for Meta said the company would start rolling out capabilities to detect AI-generated content in the coming months, but the reality is that we’re in a race right now, and we’re already falling behind. Before Swift was targeted, lawmakers introduced a bipartisan federal bill, the Preventing Deepfakes of Intimate Images Act, which would create criminal and civil penalties and give victims of this type of abuse legal recourse. A handful of states have laws against sharing sexually explicit deepfakes, but they vary in scope and often place the burden on the victim.

If we look at the last waves of tech innovation, there’s a pattern of unintended consequences arising from the speed at which we innovate. I’ve sat across from tech founders my whole career, and the one thing they consistently fail to do is fully anticipate the human impact of their products. And more often than not, those unintended consequences disproportionately affect women and girls.

This isn’t a new story, but the good news is that the ending has yet to be written. With tech innovation, we have a tremendous opportunity. That said, a future coded without consent over our own bodies online isn’t a future we should opt into. With education, regulation, and, ironically, technology itself as a solution to the harms it has created, we can claim agency over a safer internet for all of us.

While it doesn’t seem fair to ask Swift to take on this challenge, I strongly believe that if she were to take action against the perpetrators of this harm, she could help create meaningful change, setting a precedent for a safer internet for all of us. Through her work, Swift has built a powerful coalition of fans, created micro-economies, and fundamentally shifted the music industry by fighting for ownership of her songs. Justice for Taylor Swift could mean a future where we have ownership over our bodies online.


Laurie Segall is a longtime tech journalist and the founder of Mostly Human, an entertainment company that produces docs, films, and digital content focused on the intersection of technology and humanity. She is the author of Special Characters: My Adventures with Tech’s Titans and Misfits. Previously, she was CNN’s senior technology correspondent.