YouTube’s Biggest Stars Are Pushing a Shady Polish Gambling Site

Untold riches are promised on Mystery Brand, a website that sells prize-filled “mystery boxes.” If you buy one of the digital boxes, some of which cost hundreds of dollars, you might get only a fidget spinner — or you might get a luxury sports car. For just $100, users can open a box that promises rare Supreme streetwear. For only $12.99, they have a shot at a Lamborghini, or even a $250 million mega-mansion billed as “the most expensive Los Angeles realty.” Or at least that’s what some top YouTubers have been telling their young fans about the gambling site — with the video stars apparently seeing that as a gamble worth taking, especially after a dip in YouTube advertising rates.

Over the past week, hugely popular YouTube stars like Jake Paul and Bryan “Ricegum” Le have encouraged their fans to spend money on Mystery Brand, a previously little-known site that appears to be based in Poland. In their videos, Paul and Le show themselves betting hundreds of dollars on the site for a chance to open a digital “box.” At first, they win only low-value prizes like fidget spinners or Converse sneakers. By the end of the video, though, they have won thousands of dollars’ worth of tech and clothing, like rare pairs of sneakers or Apple AirPods. If they like the prize, the YouTube stars have it shipped to their house.

The gambling site doesn’t list its owner or where it’s based, although the site’s terms of service say it’s “subject to the laws and jurisdiction of Poland.” To make matters worse, users of the site might not even receive the items they believe they have won. “During using the services of the website You may encounter circumstances in which Your won items will not be received,” the terms of service read.

Also, while the ToS say that underage users are ineligible to receive prizes, many of the YouTubers promoting the site have audiences who are underage. “[Jake Paul], for example, has acknowledged that the bulk of his fanbase is between 8 and 15 years old,” reports The Daily Beast.

How YouTube’s Year-in-Review “Rewind” Video Set Off a Civil War

You might guess that a surefire way to make a hit video on YouTube would be to gather a bunch of YouTube megastars, film them riffing on some of the year’s most popular YouTube themes and release it as a year-in-review spectacular. You would be wrong.

The issue that upset so many YouTube fans, it turns out, was what the Rewind video did not show. To many, it felt like evidence that YouTube the company was snubbing YouTube the community by featuring mainstream celebrities at the expense of the platform’s homegrown creators, and by glossing over major moments in favor of advertiser-friendly scenes.

If YouTube had been trying to create an accurate picture of its platform’s most visible faces, it would need to include bigots, reactionaries and juvenile shock jocks. A YouTube recap that includes only displays of tolerance and pluralism is a little like a Weather Channel highlight reel featuring only footage of sunny days — it might be more pleasant to look at, but it doesn’t reflect the actual weather.

YouTube’s Top Earner for 2018 Is a 7-Year-Old

In 2018 the most-downloaded iPhone app was YouTube, reports USA Today, while Amazon’s best-selling item was its Fire TV Stick for streaming video. The No. 1 earner on YouTube this year is 7-year-old Ryan. For all those videos of unboxing and playing with toys — and his own new line of toys at Walmart — he and his family will pull in a cool $22 million, according to Forbes. Ryan launched the channel in 2015 — when he was four — and now has 17.3 million subscribers.

A Look at the Dark Side of the Lives of Some Prominent YouTubers, Who Are Increasingly Saying They’re Stressed, Depressed, Lonely, and Exhausted

Many YouTubers are finding themselves stressed, lonely and exhausted. For years, YouTubers have believed that they are loved most by their audience when they project a chirpy, grateful image. But what happens when the mask slips? This year there has been a wave of videos by prominent YouTubers talking about their burnout, chronic fatigue and depression. “This is all I ever wanted,” said Elle Mills, a 20-year-old Filipino-Canadian YouTuber, in a (monetised) video entitled “Burnt Out At 19”, posted in May. “And why the fuck am I so un-fucking-happy? It doesn’t make any sense. You know what I mean? Because, like, this is literally my fucking dream. And I’m fucking so un-fucking-happy.”

… The anxieties are tied up with the relentless nature of their work. Tyler Blevins, AKA Ninja, makes an estimated $500,000 every month via live broadcasts of him playing the video game Fortnite on Twitch, a service for livestreaming video games that is owned by Amazon. Most of Blevins’ revenue comes from Twitch subscribers or viewers who provide one-off donations (often in the hope that he will thank them by name “on air”). Blevins recently took to Twitter to complain that he didn’t feel he could stop streaming. “Wanna know the struggles of streaming over other jobs?” he wrote, perhaps ill-advisedly for someone with such a stratospheric income. “I left for less than 48 hours and lost 40,000 subscribers on Twitch. I’ll be back today… grinding again.” There was little sympathy on Twitter for the millionaire. But the pressure he described is felt at every level of success, from the titans of the content landscape all the way down to the people with channels with just a few thousand subscribers, all of whom feel they must be constantly creating, always available and responding to their fans.

… At the end of the month he was pale, gaunt and tired in a way that, he recalls, seemed “impervious to rest”. His work, he noticed, had become increasingly rushed and harsh in tone. Yet the angry, provocative quality of his videos seemed only to be making them more popular. “Divisive content is the king of online media today, and YouTube heavily boosts anything that riles people up,” he says. “It’s one of the most toxic things: the point at which you’re breaking down is the point at which the algorithm loves you the most.”

“Constant releases build audience loyalty,” says Austin Hourigan, who runs ShoddyCast, a YouTube channel with 1.2 million subscribers. “The more loyalty you build, the more likely your viewers are to come back, which gives you the closest thing to a financial safety net in what is otherwise a capricious space.” When a YouTuber passes the 1 million subscriber mark, they are presented with a gold plaque to commemorate the milestone. Many of these plaques can be seen on shelves and walls in the background of presenters’ rooms. In this way, the size of viewership and quantity of uploads become the main markers of value.

Top YouTube creators burn out mentally

Three weeks ago, Bobby Burns, a YouTuber with just under one million subscribers, sat down on a rock in Central Park to talk about a recent mental health episode. One week ago, Elle Mills, a creator with more than 1.2 million subscribers, uploaded a video that included vulnerable footage filmed during a breakdown. Six days ago, Rubén “El Rubius” Gundersen, the third most popular YouTuber in the world with just under 30 million subscribers, turned on his camera to talk to his viewers about the fear of an impending breakdown and his decision to take a break from YouTube.

Burns, Mills and Gundersen aren’t alone. Erik “M3RKMUS1C” Phillips (four million subscribers), Benjamin “Crainer” Vestergaard (2.7 million subscribers) and other top YouTubers have either announced brief hiatuses from the platform, or discussed their own struggles with burnout, in the past month. Everyone from PewDiePie (62 million subscribers) to Jake Paul (15.2 million subscribers) has dealt with burnout. Lately, however, it seems like more of YouTube’s top creators are coming forward with their mental health problems.

Constant changes to the platform’s algorithm, unhealthy obsessions with remaining relevant in a rapidly growing field and social media pressures are making it almost impossible for top creators to continue creating at the pace both the platform and audience want — and that can have a detrimental effect on the very ecosystem they belong to.

Screen watching at all-time high

With Netflix and Amazon Prime, Facebook Video and YouTube, it’s tempting to imagine that the tech industry destroyed TV. The world is more than 25 years into the web era, after all, more than half of American households have had home Internet for 15 years, and the current smartphone paradigm began more than a decade ago. But no. Americans still watch an absolutely astounding amount of traditional television.

In fact, television viewing didn’t peak until 2009-2010, when the average American household watched 8 hours and 55 minutes of TV per day. And the ’00s saw the greatest growth in TV viewing time of any decade since Nielsen began keeping track in 1949-1950: Americans watched 1 hour and 23 minutes more television at the end of the decade than at the beginning. Run the numbers and you’ll find that 32 percent of the increase in viewing time from the birth of television to its peak occurred in the first years of the 21st century.

Over the last 8 years, all the new, non-TV things — Facebook, phones, YouTube, Netflix — have only cut about an hour per day from the dizzying amount of TV that the average household watches. Americans are still watching more than 7 hours and 50 minutes per household per day.

YouTube, the Great Radicalizer

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content ever more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.

Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.

This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.

YouTube, YouTubers and You