Resources

37-Year-Old Mom Finds Instagram’s Sex Predators By Pretending To Be 11

Sloane Ryan is a 37-year-old woman who runs the Special Projects Team at Bark, a child-safety tech company that sells $9-a-month software monitoring text messages for bullying, threats of violence, depression, and sexual predators. "In 2018 alone, Bark alerted the FBI to 99 child predators. In 2019? That number is more than 300 — and counting."

Bark wanted a way to depict the problem to the public without using actual conversations, so Ryan began posing as a minor on Instagram.
Over the past nine months, I’ve been 15-year-old Libby and 16-year-old Kait and 14-year-old Ava. I’ve been a studious sophomore contemplating bangs and a lacrosse player being raised by her aunt and an excitable junior eager for prom….

At the beginning of the week, on the very first night as [11-year-old] "Bailey," two new messages came in within 52 seconds of publishing a photo. We sat, mouths agape, as the numbers pinged up on the screen — 2, 3, 7, 15 messages from adult men over the course of two hours. Half of them could be charged with transfer of obscene content to a minor. That night, I took a breather and sat with my head in my hands.

The second half of the article includes examples of particularly graphic conversations with what the perpetrators think is an 11-year-old girl rather than the 37-year-old woman who's investigating them. "I exit the conversation with @ XXXastrolifer to see another nine requests pending… Over the course of one week, over 52 men reached out to an 11-year-old girl."

Deepfake Porn Is Total Control Over Women’s Bodies

A lineup of female celebrities stands in front of you. Their faces move, smile, and blink as you move around them. They're fully nude, hairless, waiting for you to decide what you'll do to them as you peruse a menu of sex positions. This isn't just another deepfake porn video, or the kind of interactive, 3D-generated porn Motherboard reported on last month, but a hybrid of both, which gives people even more control of women's virtual bodies. This new type of nonconsensual porn uses custom 3D models that can be articulated and animated, which are then made to look exactly like specific celebrities with deepfaked faces. Until recently, deepfake porn consisted of taking the face of a person — usually a celebrity, almost always a woman — and swapping it onto the face of an adult performer in an existing porn video. With this method, a user can make a 3D avatar with a generic face, capture footage of it performing any kind of sexual act, then run that video through an algorithm that swaps the generic face with a real person's.

An AI-Powered App Has Resulted in an Explosion of Convincing Face-Swap Porn

In December, Motherboard discovered a Redditor named 'deepfakes' quietly enjoying his hobby: swapping celebrity faces onto porn performers' bodies. He made several convincing porn videos of celebrities — including Gal Gadot, Maisie Williams, and Taylor Swift — using a machine learning algorithm, his home computer, publicly available videos, and some spare time. Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. A Redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied by instructions that walk novices through the process.

An incredibly easy-to-use application for DIY fake videos — of sex and revenge porn, but also political speeches and whatever else you want — that moves and improves at this pace could have society-changing impacts on the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.

Put It in The Robot