Resources

A Deepfake Porn Bot Is Being Used to Abuse Thousands of Women

An AI tool that “removes” items of clothing from photos has targeted more than 100,000 women, some of whom appear to be under the age of 18.

The still images of nude women are generated by an AI that “removes” items of clothing from a non-nude photo. Every day the bot sends out a gallery of new images to an associated Telegram channel, which has almost 25,000 subscribers. The sets of images are frequently viewed more than 3,000 times. A separate Telegram channel that promotes the bot has more than 50,000 subscribers.

Some of the images produced by the bot are glitchy, but many could pass for genuine. “It is maybe the first time that we are seeing these at a massive scale,” says Giorgio Patrini, CEO and chief scientist at deepfake detection company Sensity, which conducted the research. The company is publicizing its findings in a bid to pressure services hosting the content to remove it, but it is not publicly naming the Telegram channels involved.

The actual number of women targeted by the deepfake bot is likely much higher than 104,000. Sensity was only able to count images shared publicly, and the bot gives people the option to generate photos privately. “Most of the interest for the attack is on private individuals,” Patrini says. “The very large majority of those are for people that we cannot even recognize.”

As a result, it is likely very few of the women who have been targeted know that the images exist. The bot and a number of Telegram channels linked to it are primarily Russian-language but also offer English-language translations. In a number of cases, the images created appear to depict girls who are under the age of 18, Sensity adds, saying it has no way to verify this but has informed law enforcement of their existence.

Unlike nonconsensual explicit deepfake videos, which have racked up millions of views on porn websites, these images require no technical knowledge to create. The process is automated and can be used by anyone: it’s as simple as sending the bot an image through the messaging app.

37-Year-Old Mom Finds Instagram’s Sex Predators By Pretending To Be 11

Sloane Ryan is a 37-year-old woman who runs the Special Projects Team at Bark, a child-safety tech company selling $9-a-month software that monitors text messages for bullying, threats of violence, depression, and sexual predators. “In 2018 alone, Bark alerted the FBI to 99 child predators. In 2019? That number is more than 300 — and counting.”

Bark wanted a way to depict the problem to the public without using actual conversations, so Ryan began posing as underage girls on Instagram.
Over the past nine months, I’ve been 15-year-old Libby and 16-year-old Kait and 14-year-old Ava. I’ve been a studious sophomore contemplating bangs and a lacrosse player being raised by her aunt and an excitable junior eager for prom….

At the beginning of the week, on the very first night as [11-year-old] “Bailey,” two new messages came in within 52 seconds of publishing a photo. We sat, mouths agape, as the numbers pinged up on the screen — 2, 3, 7, 15 messages from adult men over the course of two hours. Half of them could be charged with transfer of obscene content to a minor. That night, I had taken a breather and sat with my head in my hands.

The second half of the article includes examples of particularly graphic conversations with what the perpetrators think is an 11-year-old girl instead of the 37-year-old woman who’s investigating them. “I exit the conversation with @ XXXastrolifer to see another nine requests pending… Over the course of one week, over 52 men reached out to an 11-year-old girl.”

Deepfake Porn Is Total Control Over Women’s Bodies

A lineup of female celebrities stand in front of you. Their faces move, smile, and blink as you move around them. They’re fully nude, hairless, waiting for you to decide what you’ll do to them as you peruse a menu of sex positions. This isn’t just another deepfake porn video, or the kind of interactive, 3D-generated porn Motherboard reported on last month, but a hybrid of both which gives people even more control of women’s virtual bodies.

This new type of nonconsensual porn uses custom 3D models that can be articulated and animated, which are then made to look exactly like specific celebrities with deepfaked faces. Until recently, deepfake porn consisted of taking the face of a person — usually a celebrity, almost always a woman — and swapping it onto the face of an adult performer in an existing porn video. With this method, a user can make a 3D avatar with a generic face, capture footage of it performing any kind of sexual act, then run that video through an algorithm that swaps the generic face with a real person’s.

An AI-Powered App Has Resulted in an Explosion of Convincing Face-Swap Porn

In December, Motherboard discovered a Redditor named ‘deepfakes’ quietly enjoying his hobby: Face-swapping celebrity faces onto porn performers’ bodies. He made several convincing porn videos of celebrities — including Gal Gadot, Maisie Williams, and Taylor Swift — using a machine learning algorithm, his home computer, publicly available videos, and some spare time.

Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. A Redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied by instructions that walk novices through the process.

An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts in the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.

Put It in The Robot
