Resources

YouTube’s Recommender AI Still a Horror Show, Finds Major Crowdsourced Study

For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy junk/disinformation, all in the profiteering service of keeping billions of eyeballs stuck to its ad inventory. And while YouTube’s tech giant parent Google has, sporadically, responded to negative publicity flaring up around the algorithm’s antisocial recommendations — announcing a few policy tweaks or limiting/purging the odd hateful account — it’s not clear how far the platform’s penchant for promoting horribly unhealthy clickbait has actually been rebooted. The suspicion remains: nowhere near far enough.

New research published today by Mozilla backs that notion up, suggesting YouTube’s AI continues to puff up piles of “bottom-feeding”/low-grade/divisive/disinforming content — stuff that tries to grab eyeballs by triggering people’s sense of outrage, sowing division/polarization or spreading baseless/harmful disinformation. That in turn implies YouTube’s problem with recommending terrible stuff is indeed systemic: a side effect of the platform’s rapacious appetite to harvest views to serve ads. That YouTube’s AI is still — per Mozilla’s study — behaving so badly also suggests Google has been pretty successful at fuzzing criticism with superficial claims of reform. The mainstay of its deflective success here is likely the primary protection mechanism of keeping the recommender engine’s algorithmic workings (and associated data) hidden from public view and external oversight — via the convenient shield of “commercial secrecy.” But regulation that could help crack open proprietary AI blackboxes is now on the cards — at least in Europe.


YouTube Stars Were Offered Money to Spread Vaccine Misinformation

“A mysterious marketing agency secretly offered to pay social media stars to spread disinformation about Covid-19 vaccines,” reports the BBC.

“Their plan failed when the influencers went public about the attempt to recruit them.”
An influencer marketing agency called Fazze offered to pay [Mirko Drotschmann, a German YouTuber and journalist] to promote what it said was leaked information that suggested the death rate among people who had the Pfizer vaccine was almost three times that of the AstraZeneca jab. The information provided wasn’t true. It quickly became apparent to Mirko that he was being asked to spread disinformation to undermine public confidence in vaccines in the middle of a pandemic. “I was shocked,” says Mirko, “then I was curious, what’s behind all that?” In France, science YouTuber Léo Grasset received a similar offer. The agency offered him 2,000 euros if he would take part.

Fazze said it was acting for a client who wished to remain anonymous…

Both Léo and Mirko were appalled by the false claims. They pretended to be interested in order to try to find out more and were provided with detailed instructions about what they should say in their videos. In stilted English, the brief instructed them to “Act like you have the passion and interest in this topic.” It told them not to mention the video had a sponsor — and instead pretend they were spontaneously giving advice out of concern for their viewers… Since Léo and Mirko blew the whistle, at least four other influencers in France and Germany have gone public to reveal they also rejected Fazze’s attempts to recruit them.

But German journalist Daniel Laufer has identified two influencers who may have taken up the offer.

But who’s behind the mysterious influencer marketing agency?
Fazze is a part of AdNow, which is a digital marketing company, registered in both Russia and the UK. The BBC has made multiple attempts to contact AdNow by phone, email and even a letter couriered to their Moscow headquarters, but they have not responded. Eventually we managed to contact Ewan Tolladay, one of two directors of the British arm of AdNow — who lives in Durham. Mr. Tolladay said he had very little to do with Fazze — which he said was a joint venture between his fellow director — a Russian man called Stanislav Fesenko — and another person whose identity he didn’t know… Both the French and German authorities have launched investigations into Fazze’s approaches to influencers. But the identity of the agency’s mystery client remains unclear.

There has been speculation about the Russian connections to this scandal and the interests of the Russian state in promoting its own vaccine — Sputnik V.

French YouTuber Léo Grasset believes we’ll see more attempts to manipulate public opinion, especially among young people — apparently because it’s incredibly easy.

“Just spend the same money on TikTok creators, YouTube creators,” he tells the BBC. “The whole ecosystem is perfectly built for maximum efficiency of disinformation right now.”


New Survey Reveals Teens Get Their News from Social Media and YouTube

Celebrities, influencers, and personalities are as influential a source of current events as friends, family, and news organizations.

Teens today are not only getting the majority of their news online, but they are turning away from traditional media organizations to find out about current events on social media sites and YouTube, often from online influencers and celebrities, according to a new poll by Common Sense and SurveyMonkey.

The survey found that more than half of teens (54%) get news at least a few times a week from social media platforms such as Instagram, Facebook, and Twitter, and 50% get news from YouTube.

Teens’ news habits reflect the diversity of the modern media landscape. And, while most news organizations maintain accounts on social media and other platforms, they are competing for attention with corporate brands, celebrities, influencers, and personal connections. Of those teens who get their news from YouTube, for example, six in 10 say they are more likely to get it from celebrities, influencers, and personalities rather than from news organizations utilizing the platform.

What’s noteworthy is that, even with so many relying on alternative sources for the majority of their news, teens are more confident in the news they get directly from news organizations. Of teens who get news of current events from news organizations, 65% say it helps them better understand what is going on. In contrast, just 53% of teens who get news from social media say it helps them better understand what is going on, while 19% say it has made them more confused about current events.

Amid ongoing concerns about the impact of information disseminated through social media on elections, older teens’ news habits may have political implications. Of the teens age 16 and 17 who say they’ll be eligible to vote in the 2020 election, 85% are likely to cast a ballot, including 61% who say they’re “very likely.”

“These findings raise concerns about what kind of news the next generation is using to shape their decisions,” said James Steyer, CEO of Common Sense. “There are few standards for what constitutes news and how accurately it’s portrayed on the platforms teens use. With the 2020 election coming up, we need to make sure teens are getting their news from reliable sources, thinking critically, and making informed decisions.”

This latest survey is part of a Common Sense partnership with SurveyMonkey to examine media and technology trends affecting kids and their parents and to share actionable data and insights with families.

“While it’s notable that teens rely heavily on platforms such as Facebook and YouTube to stay informed, their reliance on news from celebrities and influencers rather than journalists may have pernicious implications,” said Jon Cohen, chief research officer at SurveyMonkey. “It’s a bit of a paradox: Overwhelmingly teens say they are interested in keeping up with the news, but they’re not seeking out either traditional or new media to do so.”

Selected key findings

  1. A large majority of teens age 13 to 17 in the U.S. (78%) say it’s important to them to follow current events.
  2. Teens get their news more frequently from social media sites (e.g., Facebook and Twitter) or from YouTube than directly from news organizations. More than half of teens (54%) get news from social media, and 50% get news from YouTube at least a few times a week. Fewer than half, 41%, get news reported by news organizations in print or online at least a few times a week, and only 37% get news on TV at least a few times a week.
  3. YouTube recommendations drive news consumption. Among all teens who get their news from YouTube—regardless of how often—exactly half (50%) say they most often find news on YouTube because it was recommended by YouTube itself (i.e., as a “watch next” video or in the sidebar). About half as many (27%) say they follow or subscribe to a specific channel for news on YouTube, and fewer say they find their news on YouTube through search (10%) or because it was shared by someone they know in real life (7%).
  4. Sixty percent of teens who get news from YouTube say they are more likely to get it from celebrities, influencers, and personalities as compared to news organizations (39%). The difference is even more apparent among daily YouTube news consumers (71% vs. 28%).
  5. Nearly two in three teens (65%) who get news directly from news organizations say doing so has helped them better understand current events, compared with 56% of teens who get their news from YouTube and 53% of those who get it from social media sites. Nearly two in 10 teens (19%) say that getting news from social media has made them more confused about current events.
  6. Teens clearly prefer a visual medium for learning about the news. A majority (64%) say that “seeing pictures and video showing what happened” gives them the best understanding of major news events, while just 36% say they’d prefer to read or hear the facts about what happened.
  7. Politically, teens are more likely to be moderate and identify as Democrats, but they are open to ideas from sources whose opinions differ from their own. Just under half of teens (45%) say they get news from sources that have views different from their own once a week or more, and only 14% say they never get news from sources with different views. Slightly fewer (35%) say they discuss political issues with people who have different views once a week or more, and 19% say they never discuss politics with people who have opposing views.

The study comes on the heels of the release of Common Sense’s revamped Digital Citizenship Curriculum, which gives teachers lessons to help students develop skills to be critical consumers of news at a time when they are navigating a fast-changing digital terrain fraught with fake media, hate speech, cyberbullying, and constant digital distraction.

Methodology: This SurveyMonkey Audience survey was conducted June 14 to 25, 2019, among 1,005 teenagers age 13 to 17 in the United States. Respondents for these surveys were selected from more than 2 million people who take surveys on the SurveyMonkey platform each day. The modeled error estimate for the full sample is +/-4.0 percentage points. Data has been weighted for age and sex using the Census Bureau’s American Community Survey to reflect the demographic composition of people in the United States age 13 to 17. Find the full survey results and more information about Common Sense research here.


YouTube’s Top Earner For 2019? An 8-Year-Old Who Made $26M

“An eight-year-old boy who reviews toys on YouTube has been named by Forbes as the platform’s highest earner in 2019,” reports CNN:
Ryan Kaji, whose channel Ryan’s World has 22.9 million subscribers, earned $26 million in 2019 — up $4 million from his earnings in 2018, when he also gained the highest-earning YouTuber spot… Another child, Anastasia Radzinskaya, five, came in third place with earnings of $18 million. Radzinskaya, who was born in southern Russia and has cerebral palsy, appears in videos with her father. According to Forbes, she has 107 million subscribers across seven channels and her videos have been watched 42 billion times….

Dude Perfect — a group of five friends in their thirties who play sports and perform stunts — came in second place, earning $20 million.

YouTube has announced that next year it will stop personalized advertisements on children’s content. This comes after Google agreed to pay $170 million to settle accusations that YouTube broke the law when it knowingly tracked and sold ads targeted to children.


YouTube’s Algorithm Made Fake CNN Reports Go Viral

“YouTube channels posing as American news outlets racked up millions of views on false and inflammatory videos over several months this year,” reports CNN.

“All with the help of YouTube’s recommendation engine.”

Many of the accounts, which mostly used footage from CNN, but also employed some video from Fox News, exploited a YouTube feature that automatically creates channels on certain topics. Those topic channels are then automatically populated by videos related to the topic — including, in this case, blatant misinformation.

YouTube has now shut down many of the accounts.

YouTube’s own algorithms also recommended videos from the channels to American users who watched videos about U.S. politics. That the channels could achieve such virality — one channel was viewed more than two million times over one weekend in October — raises questions about YouTube’s preparedness for tackling misinformation on its platform just weeks before the Iowa caucuses and points to the continuing challenge platforms face as people try to game their systems….

Responding to the findings on Thursday, a CNN spokesperson said YouTube needs to take responsibility.

“When accounts were deleted or banned, they were able to spin up new accounts within hours,” added Plasticity, a natural language processing and AI startup which analyzed the data and identified at least 25 different accounts which YouTube then shut down.

“The tactics they used to game the YouTube algorithm were executed perfectly. They knew what they were doing.”


Doctors Are Turning To YouTube To Learn How To Do Surgical Procedures

Some doctors say that medical students and residents are turning to YouTube to fill in gaps in their training. The video-sharing platform hosts tens of thousands of surgery-related videos, and the number keeps climbing every year.

People have livestreamed giving birth and broadcast their face-lifts. One video, which shows the removal of a dense, white cataract, has gone somewhat viral and now has more than 1.7 million views. Others seem to have found crossover appeal with nonmedical viewers, such as a video from the U.K.-based group Audiology Associates showing a weirdly satisfying removal of a giant glob of earwax. Doctors are uploading these videos to market themselves or to help others in the field, and the amount is growing by leaps and bounds. Researchers in January found more than 20,000 videos related to prostate surgery alone, compared with just 500 videos in 2009.

The videos are a particular boon for doctors in training. When the University of Iowa surveyed its surgeons, including its fourth-year medical students and residents, it found that YouTube was the most-used video source for surgical preparation by far. But residents and medical students are not the only ones tuning in. Experienced doctors are watching too: Stanford Hospital vascular surgeon Dr. Oliver Aalami said he turned to YouTube recently ahead of a particularly difficult exposure. There’s one problem with this practice that will be familiar to anybody who’s searched YouTube for tips on more mundane tasks like household repairs: How can doctors tell which videos are valid and which contain bogus information?

“[O]ne recent study found more than 68,000 videos associated with a common procedure known as a distal radius fracture immobilization,” the report adds. “The researchers evaluated the content for their technical skill demonstrated and educational skill, and created a score. Only 16 of the videos even met basic criteria, including whether they were performed by a health-care professional or institution. Among those, the scores were mixed. In several cases, the credentials of the person performing the procedure could not be identified at all.”

Other studies are finding that YouTube’s algorithm ranks videos highly even when the technique they demonstrate isn’t optimal.


Mozilla is Sharing YouTube Horror Stories To Prod Google For More Transparency

Mozilla is publishing anecdotes of YouTube viewing gone awry — anonymous stories from people who say they innocently searched for one thing but eventually ended up in a dark rabbit hole of videos. It’s a campaign aimed at pressuring Google’s massive video site to make itself more accessible to independent researchers trying to study its algorithms. “The big problem is we have no idea what is happening on YouTube,” said Guillaume Chaslot, who is a fellow at Mozilla, a nonprofit best known for its unit that makes and operates the Firefox web browser.

Chaslot is an ex-Google engineer who has investigated YouTube’s recommendations from the outside after he left the company in 2013. (YouTube says he was fired for performance issues.) “We can see that there are problems, but we have no idea if the problem is from people being people or from algorithms,” he said….

Mozilla is publishing 28 stories it’s terming #YouTubeRegrets; they include, for example, an anecdote from someone who said a search for German folk songs ended up returning neo-Nazi clips, and a testimonial from a mother who said her 10-year-old daughter searched for tap-dancing videos and ended up watching extreme contortionist clips that affected her body image.


YouTube Gets Alleged Copyright Troll To Agree To Stop Trolling YouTubers

Alleged copyright troll Christopher Brady will no longer be able to issue false DMCA takedowns to other YouTubers, according to a lawsuit settlement filed today. The Verge reports:

Under the new agreement, Brady is banned from “submitting any notices of alleged copyright infringement to YouTube that misrepresent that material hosted on the YouTube service is infringing copyrights held or claimed to be held by Brady or anyone Brady claims to represent.” Brady agreed to pay $25,000 in damages as part of the settlement. He is also prohibited from “misrepresenting or masking their identities” when using Google products, including YouTube. “This settlement highlights the very real consequences for those that misuse our copyright system. We’ll continue our work to prevent abuse of our systems,” a YouTube spokesperson told The Verge.

“I, Christopher L. Brady, admit that I sent dozens of notices to YouTube falsely claiming that material uploaded by YouTube users infringed my copyrights,” he said in an apology, shared by YouTube with The Verge. “I apologize to the YouTube users that I directly impacted by my actions, to the YouTube community, and to YouTube itself.” YouTube claimed the investigation caused the company to “expend substantial sums on its investigation in an effort to detect and halt that behavior, and to ensure that its users do not suffer adverse consequences from it.” YouTube also said that the company may be “unable to detect and prevent similar misconduct in the future,” as a result of the various methods Brady took to cover up his identity.


YouTube is Experimenting With Ways To Make Its Algorithm Even More Addictive

While YouTube has publicly said that it’s working on addressing problems that are making its website ever so addictive to users, a new paper from Google, which owns YouTube, seems to tell a different story.

It proposes an update to the platform’s algorithm that is meant to recommend even more targeted content to users in the interest of increasing engagement.

Here’s how YouTube’s recommendation system currently works. To populate the recommended-videos sidebar, it first compiles a shortlist of several hundred videos by finding ones that match the topic and other features of the one you are watching. Then it ranks the list according to the user’s preferences, which it learns by feeding all your clicks, likes, and other interactions into a machine-learning algorithm.

Among the proposed updates, the researchers specifically target a problem they identify as “implicit bias.” It refers to the way recommendations themselves can affect user behavior, making it hard to decipher whether you clicked on a video because you liked it or because it was highly recommended. The effect is that over time, the system can push users further and further away from the videos they actually want to watch.
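The two-stage flow described above — candidate generation by topical similarity, then personalized ranking — can be sketched in miniature. The toy catalog, hand-made feature vectors, and dot-product scoring below are stand-ins invented for illustration; the real system uses learned embeddings and a far larger corpus.

```python
# Toy catalog: videos as hand-made topic vectors (a stand-in for
# the learned embeddings a production recommender would use).
VIDEOS = {
    "cooking_101":  [1.0, 0.0, 0.2],
    "knife_skills": [0.9, 0.1, 0.3],
    "gaming_clip":  [0.0, 1.0, 0.5],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def recommend(watching, user_pref, k=2):
    # Stage 1: candidate generation -- shortlist videos whose
    # features best match the one currently being watched.
    candidates = [v for v in VIDEOS if v != watching]
    candidates.sort(key=lambda v: dot(VIDEOS[v], VIDEOS[watching]),
                    reverse=True)
    shortlist = candidates[:k]  # "several hundred" in the real system
    # Stage 2: re-rank the shortlist by the user's preferences
    # (user_pref stands in for the machine-learned model fed by
    # clicks, likes, and other interactions).
    shortlist.sort(key=lambda v: dot(VIDEOS[v], user_pref), reverse=True)
    return shortlist

print(recommend("cooking_101", user_pref=[1.0, 0.0, 0.0]))
```

The two stages matter because the full corpus is too large to score with the expensive personalized model; the cheap similarity pass narrows the field first.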

To reduce this bias, the researchers suggest a tweak to the algorithm: each time a user clicks on a video, it also factors in the video’s rank in the recommendation sidebar. Videos that are near the top of the sidebar are given less weight when fed into the machine-learning algorithm; videos deep down in the ranking, which require a user to scroll, are given more. When the researchers tested the changes live on YouTube, they found significantly more user engagement. Though the paper doesn’t say whether the new system will be deployed permanently, Guillaume Chaslot, an ex-YouTube engineer who now runs AlgoTransparency.org, said he was “pretty confident” that it would happen relatively quickly.
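The rank-dependent weighting can be sketched in the style of inverse-propensity scoring from the learning-to-rank literature. The log-based position model below is an assumption for illustration only, not the formula from the Google paper.

```python
import math

def examination_propensity(rank):
    # Assumed position-bias model: items near the top of the
    # sidebar are far more likely to be seen at all, so clicks
    # there are "cheaper" signals. Rank 1 has propensity 1.0.
    return 1.0 / math.log2(rank + 1)

def training_weight(rank):
    # Inverse-propensity weight: a click on a video buried deep
    # in the ranking (which the user had to scroll to find) counts
    # for more when fed back into the model than a click at the top.
    return 1.0 / examination_propensity(rank)

for rank in (1, 5, 20):
    print(rank, round(training_weight(rank), 2))
```

Down-weighting top-of-sidebar clicks keeps the model from simply relearning its own past recommendations, which is the feedback loop the researchers call implicit bias.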


Politicians Can Break Our Content Rules, YouTube CEO Says

YouTube CEO Susan Wojcicki said this week that content by politicians would stay up on the video-sharing website even if it violates the company’s standards, echoing a position staked out by Facebook this week.

“When you have a political officer that is making information that is really important for their constituents to see, or for other global leaders to see, that is content that we would leave up because we think it’s important for other people to see,” Wojcicki told an audience at The Atlantic Festival this morning. Wojcicki said the news media is likely to cover controversial content regardless of whether it’s taken down, giving context to understand it. YouTube is owned by Google. A YouTube spokesperson later told POLITICO that politicians are not treated differently than other users and must abide by its community guidelines. The company grants exemptions to some political speech if the company considers it to be educational, documentary, scientific, or artistic in nature.


Parents Are Spending Thousands On YouTube Camps That Teach Kids How To Be Famous

Various YouTube summer camps have begun launching across the nation, designed to turn regular elementary and middle-school-aged children into bona fide internet sensations. Per a recent report from the Wall Street Journal, parents are spending nearly $1,000 a week for their children to learn how to create branded social media content. Though YouTube is not affiliated with or in communication with any of these summer programs, such camps are on the rise, and parents with means have made them a thing.

One summer camp gaining traction is YouTube STAR Creator Studio. Located in Culver City, California, its website states that it “branches out from traditional storytelling to how to create the fun and hilarious content that kids love to watch.” The camp is designed for those in first through sixth grade, according to the website, and charges $375 a week. Another prominent company is Level Up, which, according to the organization, became the first company in North America to offer YouTube classes and camps when it opened five years ago. Level Up takes an educational approach toward the platform to attract kids who “want to learn how to create an awesome YouTube channel,” and promises that the class will give students the “skills to create engaging videos.” The topics covered in Level Up’s summer camp include learning how to interview people, draft storyboard ideas, and source and sync audio files.

Despite the rise in programs, many parents interviewed in the Wall Street Journal dismissed the idea of being a “YouTube star,” seeing it as nothing more than a hobby for their young children. [Camp director] disagrees, believing that being a YouTuber is a viable career path for the next generation.

“YouTube is now not only the preferred source of entertainment for kids, but it is also now their preferred career choice.”


YouTube’s Biggest Stars Are Pushing a Shady Polish Gambling Site

Untold riches are promised on Mystery Brand, a website that sells prize-filled “mystery boxes.” If you buy one of the digital boxes, some of which cost hundreds of dollars, you might only get a fidget spinner — or you might get a luxury sports car. For just $100, users can win a box filled with rare Supreme streetwear. For only $12.99, they can win a Lamborghini, or even a $250 million mega-mansion billed as “the most expensive Los Angeles realty.” Or at least that’s what some top YouTubers have been telling their young fans about the gambling site — with the video stars apparently seeing that as a gamble worth taking, especially after a dip in YouTube advertising rates.

Over the past week, hugely popular YouTube stars like Jake Paul and Bryan “Ricegum” Le have encouraged their fans to spend money on Mystery Brand, a previously little-known site that appears to be based in Poland. In their videos, Paul and Le show themselves betting hundreds of dollars on the site for a chance to open a digital “box.” At first, they win only low-value prizes like fidget spinners or Converse sneakers. By the end of the video, though, they have won thousands of dollars worth of tech and clothing, like rare pairs of sneakers or Apple AirPods. If they like the prize, the YouTube stars have it shipped to their house.
The gambling site doesn’t list the owner or location where it’s based, although the site’s terms of service say it’s “subject to the laws and jurisdiction of Poland.” To make matters worse, users of the site might not even receive the items they believed they have won. “During using the services of the website You may encounter circumstances in which Your won items will not be received,” the terms of service reads.

Also, while the ToS say that underage users are ineligible to receive prizes, many of the YouTubers promoting the site have audiences who are underage. “[Jake Paul], for example, has acknowledged that the bulk of his fanbase is between 8 and 15 years old,” reports The Daily Beast.


“How YouTube’s Year-in-Review ‘Rewind’ Video Set Off a Civil War”

You might guess that a surefire way to make a hit video on YouTube would be to gather a bunch of YouTube megastars, film them riffing on some of the year’s most popular YouTube themes and release it as a year-in-review spectacular. You would be wrong.

The issue that upset so many YouTube fans, it turns out, was what the Rewind video did not show. To many, it felt like evidence that YouTube the company was snubbing YouTube the community by featuring mainstream celebrities in addition to the platform’s homegrown creators, and by glossing over major moments in favor of advertiser-friendly scenes.

If YouTube had been trying to create an accurate picture of its platform’s most visible faces, it would need to include bigots, reactionaries and juvenile shock jocks. A YouTube recap that includes only displays of tolerance and pluralism is a little like a Weather Channel highlight reel featuring only footage of sunny days — it might be more pleasant to look at, but it doesn’t reflect the actual weather.


YouTube’s Top Earner For 2018 Is a 7-Year-Old

In 2018 the most-downloaded iPhone app was YouTube, reports USA Today, while Amazon’s best-selling item was their Fire TV Stick for streaming video. The No. 1 earner on YouTube this year is 7-year-old Ryan. For all those unboxing videos and playing with toys — and his own new line of toys at Walmart — he and his family will pull in a cool $22 million, according to Forbes. Ryan launched the channel in 2015 — when he was four — and now has 17.3 million followers.


A Look at the Dark Side of the Lives of Some Prominent YouTubers, Who Are Increasingly Saying They’re Stressed, Depressed, Lonely, and Exhausted

Many YouTubers are finding themselves stressed, lonely and exhausted. For years, YouTubers have believed that they are loved most by their audience when they project a chirpy, grateful image. But what happens when the mask slips? This year there has been a wave of videos by prominent YouTubers talking about their burnout, chronic fatigue and depression. “This is all I ever wanted,” said Elle Mills, a 20-year-old Filipino-Canadian YouTuber in a (monetised) video entitled Burnt Out At 19, posted in May. “And why the fuck am I so unfucking unhappy? It doesn’t make any sense. You know what I mean? Because, like, this is literally my fucking dream. And I’m fucking so un-fucking-happy.”

… The anxieties are tied up with the relentless nature of their work. Tyler Blevins, AKA Ninja, makes an estimated $500,000 every month via live broadcasts of him playing the video game Fortnite on Twitch, a service for livestreaming video games that is owned by Amazon. Most of Blevins’ revenue comes from Twitch subscribers or viewers who provide one-off donations (often in the hope that he will thank them by name “on air”). Blevins recently took to Twitter to complain that he didn’t feel he could stop streaming. “Wanna know the struggles of streaming over other jobs?” he wrote, perhaps ill-advisedly for someone with such a stratospheric income. “I left for less than 48 hours and lost 40,000 subscribers on Twitch. I’ll be back today… grinding again.” There was little sympathy on Twitter for the millionaire. But the pressure he described is felt at every level of success, from the titans of the content landscape all the way down to the people with channels with just a few thousand subscribers, all of whom feel they must be constantly creating, always available and responding to their fans.

At the end of the month he was pale, gaunt and tired in a way that, he recalls, seemed “impervious to rest”. His work, he noticed, had become increasingly rushed and harsh in tone. Yet the angry, provocative quality of his videos seemed only to be making them more popular. “Divisive content is the king of online media today, and YouTube heavily boosts anything that riles people up,” he says. “It’s one of the most toxic things: the point at which you’re breaking down is the point at which the algorithm loves you the most.”

“Constant releases build audience loyalty,” says Austin Hourigan, who runs ShoddyCast, a YouTube channel with 1.2 million subscribers. “The more loyalty you build, the more likely your viewers are to come back, which gives you the closest thing to a financial safety net in what is otherwise a capricious space.” When a YouTuber passes the 1 million subscribers mark, they are presented with a gold plaque to mark the event. Many of these plaques can be seen on shelves and walls in the background of presenters’ rooms. In this way, the size of viewership and quantity of uploads become the main markers of value.


Top YouTube Creators Burn Out Mentally

Three weeks ago, Bobby Burns, a YouTuber with just under one million subscribers, sat down on a rock in Central Park to talk about a recent mental health episode. One week ago, Elle Mills, a creator with more than 1.2 million subscribers, uploaded a video that included vulnerable footage during a breakdown. Six days ago, Rubén “El Rubius” Gundersen, the third most popular YouTuber in the world with just under 30 million subscribers, turned on his camera to talk to his viewers about the fear of an impending breakdown and his decision to take a break from YouTube.

Burns, Mills and Gundersen aren’t alone. Erik “M3RKMUS1C” Phillips (four million subscribers), Benjamin “Crainer” Vestergaard (2.7 million subscribers) and other top YouTubers have either announced brief hiatuses from the platform, or discussed their own struggles with burnout, in the past month. Everyone from PewDiePie (62 million subscribers) to Jake Paul (15.2 million subscribers) has dealt with burnout. Lately, however, it seems like more of YouTube’s top creators are coming forward with their mental health problems.

Constant changes to the platform’s algorithm, unhealthy obsessions with remaining relevant in a rapidly growing field and social media pressures are making it almost impossible for top creators to continue creating at the pace both the platform and audience want — and that can have a detrimental effect on the very ecosystem they belong to.

698

Screen watching at all-time high

With Netflix and Amazon Prime, Facebook Video and YouTube, it’s tempting to imagine that the tech industry destroyed TV. The world is more than 25 years into the web era, after all, more than half of American households have had home Internet for 15 years, and the current smartphone paradigm began more than a decade ago. But no. Americans still watch an absolutely astounding amount of traditional television.

In fact, television viewing didn’t peak until 2009-2010, when the average American household watched 8 hours and 55 minutes of TV per day. And the ’00s saw the greatest growth in TV viewing time of any decade since Nielsen began keeping track in 1949-1950: Americans watched 1 hour and 23 minutes more television at the end of the decade than at the beginning. Run the numbers and you’ll find that 32 percent of the increase in viewing time from the birth of television to its peak occurred in the first years of the 21st century.

Over the last 8 years, all the new, non-TV things — Facebook, phones, YouTube, Netflix — have only cut about an hour per day from the dizzying amount of TV that the average household watches. Americans are still watching more than 7 hours and 50 minutes per household per day.
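The figures quoted above can be sanity-checked with a little arithmetic. A rough sketch, using only the numbers stated in the text (the 1949-50 baseline is inferred from them, not given in the article):

```python
# Sanity-check the Nielsen viewing figures quoted above.
# All values in minutes per household per day.

peak = 8 * 60 + 55             # 2009-10 peak: 8h55m
decade_growth = 1 * 60 + 23    # growth across the '00s: 1h23m

# If the '00s account for 32 percent of the total rise from TV's
# birth to its peak, the implied total rise and the implied
# 1949-50 starting point are:
total_growth = decade_growth / 0.32        # ~259 minutes
implied_baseline = peak - total_growth     # ~276 minutes, about 4h36m

# "Only cut about an hour per day" from the peak leaves households at:
today = peak - 60                          # 475 minutes = 7h55m
assert today > 7 * 60 + 50                 # "more than 7 hours and 50 minutes"

print(divmod(round(implied_baseline), 60))  # (4, 36)
print(divmod(today, 60))                    # (7, 55)
```

The implied baseline of roughly four and a half hours a day in 1949-50 is consistent with the article's framing of television growing steadily for six decades before the 2009-10 peak.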

648

YouTube, the Great Radicalizer

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content that grew more and more extreme, well beyond the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.

Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college laptop education market in the United States, typically come loaded with ready access to YouTube.

This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.

836

YouTube, YouTubers and You

739