Get ready to overthink your decision to share photos online.
As artificial intelligence (AI) grows more sophisticated, anxiety about the technology has grown with it. AI used to be a tool available exclusively to computer scientists; now it's largely available to the public, often for free. That means, among other things, that the average person now has a kind of power that might have been unimaginable even 10 or 15 years ago.
This power can, and likely will, be used for many wonderful things. But it can be, and already is being, used for truly terrible things, too. That includes AI-generated porn, which is often made by stealing the likenesses of real people without their consent or even their awareness.
We checked in with privacy law expert Leah Plunkett, the author of Sharenthood and a faculty member at Harvard Law School, who explained what you need to know about how artificial intelligence has infiltrated the porn industry, why it matters for everyday people, and (most importantly) what steps you can take to protect your privacy online.
The rise of artificial intelligence in porn
Porn created via artificial intelligence isn’t new; in fact, it’s been around for years. What is new (and redefining the landscape at a terrifying pace) is how much more accessible AI tools have become, enabling just about anyone to create their own porn video. As The Washington Post notes, any computer-savvy person can take pictures of an ex’s face and use AI to paste that face into explicit images, or even explicit videos.
On the web, you might already stumble upon a video of a person saying something they never actually said, or a picture of someone standing in a state they’ve never visited. That’s called a deepfake: a piece of synthetic media, like a picture or a video, in which a person is replaced with another person’s likeness.
Deepfakes have been a topic of conversation for a while now: You might have seen the news covering the disturbing trend of deepfakes involving politicians (former President Barack Obama in particular has been a frequent target of these videos). But what you probably don’t know is that 96 percent of deepfakes on the internet were some form of pornography as of 2019 — and nearly all of those deepfakes were of women.
Just a month ago, a well-known Twitch streamer (known for creating wholesome content involving video games and baking) spoke out about being a victim of a disturbing deepfake incident. Known only as QTCinderella, the previously anonymous gamer decided to show her face in a live stream after she’d found her likeness, as well as her brand, used in porn videos without her awareness or consent.
“I wanted to show this is a big deal,” she said. “Every single woman on that website, this is how they feel. Stare at me sobbing, and tell me you still think this is OK.”
Why you’re at risk of “appearing” in AI porn or graphic deepfakes without your consent
In 2023, there are a number of ways that your likeness might show up in graphic videos or pictures without your knowledge or consent. The first is one we mentioned above: Someone you know personally could take a picture or video of you and alter it using one of the many readily available (and mostly free) AI tools found online and in app stores.
Perhaps more unnervingly, someone could use pictures of you that you aren’t even aware are available on the internet. For example, there’s an AI site called Pimeyes that allows a person to upload a single picture of another person, and then see every other picture of that person that the algorithm can find online.
Numerous users have reported how uncannily efficient this software is. One adult woman reported uploading a current picture of herself, only to receive a picture of herself at 14 years old that the AI tool had somehow dug up.
As the internet gets bigger and wilder, with more and more sites run by highly sophisticated algorithms, it can be nearly impossible for any person to track down every bit of their data that exists online.
All of this means, unfortunately, that there are likely a whole lot of pictures of you floating around in the digital ether — but there are a few things you can do to protect yourself.
Is it possible to protect your likeness from AI deepfakes?
Right now, there aren’t many laws that explicitly protect individuals from deepfakes — but there are certain steps you can take to limit the chances that your likeness will be used without your consent.
For the website Pimeyes, you can file an “opt-out” request that bars the site from searching for your likeness. It’s unfortunately pretty time-intensive, and it requires you to submit identification (such as a passport) to verify who you are — but that might be worth it if you want to keep your identity protected from those who use the site.
In terms of more widescale solutions, there’s good news and bad news. Here’s the bad: It takes a lot of painstaking work to remove pictures of yourself that already exist on the internet.
The good news: There are plenty of easy ways to ensure that future photos don’t make their way onto the internet — and there are a number of steps you can take to advocate for better privacy laws in the future.
“When it comes to AI and facial recognition, in particular, we are in a space where the laws do not provide us with enough privacy protection,” Plunkett says. For that reason among others, she encourages you to think of any mobile device, app, or website as “a really cool car…but one in which the brakes may not work, and that doesn’t come with a very good driver’s manual.”
When you download an app, are you aware of the terms and conditions? Are you certain that it can’t “use” your photos, texts, or other personal data in any way, shape, or form? Consider those questions before you provide any additional info.
If you want to see what’s happening with privacy laws and internet protections, Plunkett suggests you look at the state level of government, rather than the federal. “Get yourself acquainted with what’s happening in your state,” she encourages. “And if you already have good, cutting-edge data privacy laws and regulations in your state, then think about learning a little bit more about how they can apply in your kid’s school, your workplace, or restaurants or community spaces.”
One silver lining to this whole mess is that privacy laws are a bipartisan issue, Plunkett says.
“This is especially true at the state level,” she says. “Because when we’re talking to our state, elected, and appointed officials, they are our neighbors and community members. And they also want to live in a jurisdiction that respects their privacy.”