“Dehumanizing, degrading”: One woman’s fight against deepfake porn


Watch the CBSN Originals documentary “Deepfakes and the Fog of Truth” in the video player above. It premieres on CBSN Sunday, Oct. 17, at 8 p.m., 11 p.m. and 2 a.m. ET.


When Australian law student Noelle Martin was 18, she did what most people who’ve grown up with the internet do: she Googled herself. But rather than finding photos of her family vacations, she was shocked to find explicit photos of herself posted to pornographic sites. Martin, however, never took those photos. Her face had been edited onto the bodies of adult film actresses and posted online. 

“I saw images depicting me having sexual intercourse, images of me in solo positions where my face was doctored onto the naked bodies of adult actresses,” she said.

Photos harvested from Martin’s social media accounts had been used to create the realistic-looking graphic images. She did not know who created the images or why she was targeted, but the images continued to appear on fetish websites.  

“I am not a public figure, I’m not a celebrity. I’m literally just an ordinary person, literally a nobody from Perth, Western Australia,” Martin told CBSN Originals. “And I didn’t have any partners or ex-partners who might’ve done something like this to me.”

Martin tried contacting the police, private investigators and government agencies, but because she didn’t know where the images originated, there was no way to hold the creators accountable. Martin even attempted to contact the operators of the porn sites that hosted the photos of her, but those efforts sometimes led to more abuse. 

“Sometimes I’d get a response and they’d remove it, and then it’ll pop up two weeks later,” she said. “And then one time, one perpetrator said that they’d only remove the material if I sent them nude photos of myself within 24 hours.”

Noelle Martin has spent years battling the spread of deepfake porn online. “It was completely horrifying, dehumanizing, degrading, violating to just see yourself being misrepresented and being misappropriated in that way,” she said. CBS News

Martin’s efforts to scrub the internet of the non-consensual porn depicting her were unsuccessful, and the abuse continued to escalate. Martin says she received an email at work from someone anonymously telling her there were now videos of her having sex posted on internet porn sites. Those videos were deepfakes — manipulated media created with sophisticated digital tools that edited Martin’s face onto the body of an adult film actress.

“It was completely horrifying, dehumanizing, degrading, violating to just see yourself being misrepresented and being misappropriated in that way,” Martin said.

In recent years, improvements in artificial intelligence have allowed for cheaper and more accessible technology in the world of visual effects, enabling the creation of deepfakes, where a person in a video can be replaced by someone else, nearly replicating their likeness and movements. While these innovations have democratized the creation of special effects, the technology has been largely used to attack women.

According to a 2019 report by the cybersecurity company Deeptrace, 96% of all deepfakes online are pornographic and the top five deepfake pornography websites exclusively target women. 

Adam Dodge, an attorney and the founder of endtab, an organization that works toward ending technology-enabled abuse, says deepfakes have been used to target women since the technology was released to the masses. 

“In early 2018 we were made aware of deepfakes by being exposed to celebrity deepfake pornography,” Dodge said. “So there wasn’t even a chance to look at it through a different lens of, ‘Hey, this could be used in movie editing or inserting somebody into a Star Wars film.’ It was, ‘No, this is a weaponized form of technology used to harm women and girls online.'”

According to Dodge, the legislative system has been slow to react to the threat women face from deepfakes. 

“In most states, non-consensual pornography is illegal,” he explained. “However, if you create deepfake non-consensual pornography, those laws are not going to apply because it’s not the victim’s body being portrayed in the video. It’s just their face. So the video won’t meet the threshold to be prosecuted under that law.”

The owner of one of the internet’s most popular deepfake porn sites, who goes by the pseudonym Dom, says his site exclusively posts non-consensual deepfake porn of celebrities. 

“I don’t feel bad for celebrities,” Dom said. “I think that, as a public figure, they’re more equipped with this happening. I think they know it, too, about people fantasizing about them.” 

Dom, who created his site after Reddit banned its deepfake subreddit for breaking its rules against involuntary pornography, says his site gets approximately 350,000 visitors per day. And while the main page of his site is focused on public figures, he also hosts a forum that serves as a marketplace where deepfake creators can connect with people requesting deepfakes of non-celebrities. 

“Anything could go behind the scenes and I wouldn’t know or moderators wouldn’t know,” Dom said. “It’s hard to police everything.”

Dom says he feels bad for non-celebrities who become victims of non-consensual deepfake porn, and says despite the difficulty in policing the forums he will ban users if he finds out they are making videos of non-celebrities. But Dom believes the videos he does post are clearly fake and therefore he has no hesitation about keeping his site online. 

“If users make sure that people know that it’s fake and it’s clearly labeled as a fake for entertainment purposes, as long as they don’t try to pass it off as the real thing, that’s basically where my cutoff is,” he said.

But for victims of deepfake porn, the acknowledgement that the media has been manipulated does not lessen the impact. 

“Ultimately they’re being fetishized and sexualized without their consent,” Dodge said. “And regardless of whether the video or the photo that’s being distributed or shared is believed by the viewer, it is tremendously harmful because it just doesn’t feel good to have that out there.”

Martin says being a victim of image-based abuse changed the trajectory of her life. 

“It robs you of opportunities, and it robs you of your career, and your hopes and your dreams,” she said. “I have been admitted as a lawyer, and that’s the only thing that I’ve ever wanted to do. And particularly in that area where it’s all about name and image and reputation, it’s been extremely hard for me to find employment.” 

Since Martin first found doctored images of herself on Google, she has become an advocate for victims of similar forms of non-consensual pornography. She says her outspokenness made her an even greater target of abuse.

“Because I was speaking out about it, the perpetrators decided to create fake videos of me,” Martin said. “You only seek to lose when you talk about something like this, because when you dare to speak about this kind of abuse, you expose yourself to more people seeing the very thing you don’t want people to see.”

Despite attempts to silence her, Martin has remained an outspoken activist. She advocated for legal solutions that led Australia to make image-based abuse a criminal offense in 2017. But these years have taken a toll.

“It’s just been such a life-shattering experience,” Martin said. “And I actually think it’s just caused me more pain than it has actually given me strength or resilience, as much as that’s not a great way to look at the world. It really has just almost destroyed me.”

Free America Network Articles
