People Can Put Your Face on Porn—and the Law Can't Help You

The grosser parts of the internet have a new trick: Using machine learning and AI to swap celebrities’ faces onto porn performers’. The result? Fake celebrity porn seamless enough to be mistaken for the real thing. Early victims include Daisy Ridley, Gal Gadot, Scarlett Johansson, and Taylor Swift. Originally reported by Motherboard, this nasty trend has been brewing for months, acquiring its own subreddit. And now that someone has made an app—drastically lowering the technical threshold would-be creators have to clear—it’s presumably about to become much more prevalent.

For reasons that are eye-poppingly obvious, these videos—which their creators refer to as “deepfakes,” after the redditor who created the process—are terrible. It’s a noxious smoothie made of some of today’s worst internet problems. It’s a new frontier for nonconsensual pornography and fake news alike. (Doctored videos of political candidates saying outlandish things in 3, 2…) And worst of all? If you live in the United States and someone does this with your face, the law can’t really help you.

To many vulnerable people on the internet, especially women, this looks a whole lot like the end times. “I share your sense of doom,” says Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. “I think it is going to be that bad.”

She should know. Franks helped write much of the US’s existing legislation that criminalizes nonconsensual porn—and it's not going to help. It’s not that Franks and lawmakers weren’t thinking about the implications of manipulated images. It’s that the premise of any current legislation is that nonconsensual porn is a privacy violation. Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue. That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.

And it's the very artifice involved in these videos that provides enormous legal cover for their creators. “It falls through the cracks because it’s all very betwixt and between,” says Danielle Citron, a law professor at the University of Maryland and the author of Hate Crimes in Cyberspace. “There are all sorts of First Amendment problems because it’s not their real body.” Since US privacy laws don’t apply, taking these videos down could be considered censorship—after all, this is “art” that redditors have crafted, even if it’s unseemly.

In case after case, the First Amendment has protected spoofs and caricatures and parodies and satire. (This is why porn has a long history of titles like Everybody Does Raymond and Buffy the Vampire Layer.) According to Citron, claiming that face-swap porn is parody isn't the strongest legal argument—it's clearly exploitative—but that’s not going to stop people from muddying the legal waters with it.

So What Now?

Does that mean that victims have zero hope of legal recourse? Not exactly. Celebrities will be able to sue for the misappropriation of their images. But that usually applies to commercial contexts—like, say, if someone took one of Gal Gadot’s social media photos and used it to promote a strip club without her consent—and commercial speech doesn’t enjoy nearly the protection that individual citizens’ speech does.

For the average citizen, your best hope is anti-defamation law. When Franks realized that revenge porn law wouldn't include language about false images, she recommended that lawmakers update their anti-defamation statutes to handle it—but in many cases, that hasn’t happened yet. And Franks thinks claimants will have difficulty proving that the creators intended to cause them emotional distress. So far, these videos seem to have been created for the pleasure of the creator rather than the humiliation of the object of their desire. “Inevitably, someone will point out how many young men had posters of Princess Leia in their bedrooms as a masturbation fantasy,” Franks says. “Is the harm just that you found out about it? Legally, we need to be able to articulate what is the harm, not just that it makes us feel icky.” And in a case as fringe as AI-enabled porn, that articulation hasn’t happened yet.

For a longer-term solution, the most viable way to stem the tide of face-swap porn may be to take away the distributive technology: that app. The Federal Trade Commission Act prohibits “unfair or deceptive acts or practices in or affecting commerce,” a provision that could become the point of attack. “If we can think about it creatively, the app creator could be liable,” Citron says. “The app is using someone’s data and morphing it onto someone else’s.” (Google ran afoul of the same stipulation in 2013.)

The US has no “right to be forgotten” statutes, the kind of law that lets private citizens petition to have online material involving them taken down. But temporary solutions do exist. Google has said it will delink nonconsensual porn from searches for the victim’s name.

Similarly, online platforms could step up and swat these videos down—or at least mark them as faked. “AI to detect those kinds of edited videos exists and is pretty good. Separately, detecting porn is pretty straightforward,” says Jen Golbeck, a computer scientist at the University of Maryland. “So for someplace like YouTube to do this, they would need to combine the porn detector they certainly have with an edited video detector. The technology should not be hard.” Given that the videos are coming from a limited selection of apps, there might be an even clearer signature to home in on.
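For a sense of what Golbeck’s combination might look like in practice, here is a minimal, purely illustrative Python sketch. The detector functions, thresholds, and file name below are hypothetical stand-ins—not anything YouTube, Reddit, or any real platform actually runs—and the only point is that the two signals are composed with a simple AND.

```python
# Illustrative sketch only: flag a video as likely face-swap porn when BOTH a
# porn/NSFW classifier and an edited-video detector fire. The two scoring
# functions are hypothetical stand-ins returning dummy values; a real platform
# would swap in its own trained models.

from dataclasses import dataclass


@dataclass
class ModerationResult:
    porn_score: float          # 0.0-1.0 from an NSFW classifier
    manipulation_score: float  # 0.0-1.0 from a face-swap/edit detector
    flagged: bool              # True only if both scores cross their thresholds


def score_porn(video_path: str) -> float:
    """Hypothetical stand-in for a porn detector; returns a dummy score."""
    return 0.9


def score_manipulation(video_path: str) -> float:
    """Hypothetical stand-in for an edited/synthetic-video detector."""
    return 0.85


def moderate(video_path: str,
             porn_threshold: float = 0.8,
             edit_threshold: float = 0.7) -> ModerationResult:
    """Flag a video only when both detectors agree it is manipulated porn."""
    p = score_porn(video_path)
    m = score_manipulation(video_path)
    return ModerationResult(p, m, flagged=(p >= porn_threshold and m >= edit_threshold))


if __name__ == "__main__":
    print(moderate("uploaded_video.mp4"))
```

The composition logic is the easy part; the real work for a platform lies in the detectors themselves and in choosing thresholds that don’t misflag legitimate edited footage.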

Verifying the authenticity of videos (or the lack thereof) will only become more important as this technology proliferates. So will confronting what constitutes defamation, emotional harm, and consent in digital contexts. But while all of that is critical, it will take an enormous struggle to make sure those questions are actually confronted. “If we can’t get to the point where it’s not okay for a boss to proposition a secretary, how are we going to rethink this?” Franks says. “We’re so far away from the conversation we need to be having.” For these early victims—as with the early victims of revenge porn—cultural concern has already arrived too late.
