Fake-porn videos are being weaponized to harass and humiliate women: ‘Everybody is a potential target’
“Deepfake” creators are making disturbingly realistic, computer-generated videos with photos taken from the Web, and ordinary women are suffering the damage.
The video showed the woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile.
It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online.
She felt nauseated and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?
“I feel violated — this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career. “It’s this weird feeling, like you want to tear everything off the Internet. But you know you can’t.”
Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike “deepfake” videos have quickly multiplied across the Internet, blurring the line between truth and lie.
But the videos have also been weaponized disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse. The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect. And although their legality hasn’t been tested in court, experts say they may be protected by the First Amendment — even though they might also qualify as defamation, identity theft or fraud.
Disturbingly realistic fakes have been made with the faces of both celebrities and women who don’t live in the spotlight, and the actress Scarlett Johansson says she worries that “it’s just a matter of time before any one person is targeted” by a lurid forgery.
Johansson has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the Web: One video, falsely described as real “leaked” footage, has been watched on a major porn site more than 1.5 million times. She said she worries it may already be too late for women and children to protect themselves against the “virtually lawless [online] abyss.”
“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause. . . . The Internet is a vast wormhole of darkness that eats itself.”
In September, Google added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.” But there is no easy fix for the videos’ creation and spread.
A growing number of deepfakes target women far from the public eye, with anonymous users on deepfake discussion boards and in private chats calling them co-workers, classmates and friends. Several users who make videos by request said there’s even a going rate: about $20 per fake.
The requester of the video showing the woman’s face atop the body in the pink off-the-shoulder top had included 491 photos of her face, many taken from her Facebook account, and told other members of the deepfake site that he was “willing to pay for good work :-).” A Washington Post reporter later found her by running those portraits through a reverse-image search, an online tool that can locate where a photo was originally shared.
It had taken two days after the request for a team of self-labeled “creators” to deliver. A faceless online audience celebrated the effort. “Nice start!” the requester wrote.
“It’s like an assault: the sense of power, the control,” said Adam Dodge, the legal director of Laura’s House, a domestic-violence shelter in California. Dodge hosted a training session last month for detectives and sheriff’s deputies on how deepfakes could be used by an abusive partner or spouse. “With the ability to manufacture pornography, everybody is a potential target,” Dodge said.
For decades, video has served as a benchmark for authenticity, offering a clear distinction from photos, which could be easily distorted. Fake video, for everyone except high-level artists and film studios, has always been too technically complicated to get right.
But recent breakthroughs in machine-learning technology, employed by creators racing to refine and perfect their fakes, have made fake-video creation more accessible than ever. All that’s needed to make a persuasive mimicry within a matter of hours is a computer and a robust collection of photos, such as those posted by the millions onto social media every day.
Read more via Washington Post