
Logan Paul forced YouTube to admit humans are better than algorithms

It took a video of a dead body for the company to finally reconsider its approach.


YouTube is no stranger to controversy. Many of its top stars have been in hot water recently: from PewDiePie making racist remarks to a "family" channel posting abusive kid pranks, the company has been under fire for not keeping a closer eye on the content that makes it onto the site. Most recently, Logan Paul, a popular YouTuber with more than 15 million subscribers, faced backlash after posting a video that showed a corpse he came across in Japan's so-called "Suicide Forest." That clip, which Paul himself eventually took down, forced YouTube to cut almost all ties with him and to figure out how to prevent another situation like this.

Up until now, Google's (and by extension YouTube's) solution had been to take down offensive channels and tweak its advertiser-friendly guidelines to give brands more control over where their ads show up. But the tech giant is now taking that a step further. Earlier this week, it announced that YouTube will manually review uploads from accounts in its Google Preferred ad tier, which lets brands place advertisements in videos from the top five percent of YouTube creators.

The shift is notable because it means YouTube will rely less on algorithms to catch bad actors, something social media companies are finally realizing needs to happen. Facebook and Twitter have likewise vowed to hire more humans as they crack down on the bots and troll accounts that have plagued their sites. What Google and YouTube hope, naturally, is that this will help them avoid another mess like the one Logan Paul created.


Although Paul's channel, "Logan Paul Vlogs," still lives on the platform, YouTube has put on hold the original projects he was working on for YouTube Red, its paid ad-free streaming service. It also terminated his lucrative Google Preferred ad deal, and while he will still be able to monetize his content, losing that advertising package will likely cost him a significant share of his income. For context, he was reportedly the fourth highest-paid YouTuber in 2017, according to Forbes, earning an estimated $12.5 million -- thanks to Preferred, his Maverick apparel line and sponsored posts on social media.

The decision was likely a tough one for YouTube, considering the millions of people who watch Logan Paul's channel and, perhaps most importantly, the level of influence he has over a key demographic: teenagers. But YouTube had to make an example out of him in order to appease advertisers, who have grown increasingly concerned that their ads could appear alongside disturbing or inappropriate videos. Last year, AT&T and Verizon (which owns Engadget), among others, pulled ads from Google's platform after they were displayed on videos related to terrorism and hate groups.

YouTube is also implementing stricter requirements for its Partner Program, which lets smaller channels earn money by placing ads in their videos, to help filter out offensive content. Creators can now become YouTube Partners only if they have logged 4,000 hours of watch time in the past 12 months and have more than 1,000 subscribers. These changes come on top of the ones made in 2017, when YouTube began requiring a minimum of 10,000 channel views to be granted partnership status. The company says setting these thresholds will prevent low-quality videos from making money and stop channels from uploading stolen content. That said, it still plans to depend heavily on viewers flagging videos that may violate YouTube's community guidelines.

YouTube star "PewDiePie"

The main challenge for YouTube is that it's often the top users uploading dubious content, not the smaller channels. That raises the question of why it took the company so long to act, at least in a tougher manner. It's not as if YouTube hasn't dealt with cases similar to Logan Paul's in the past. Take Felix "PewDiePie" Kjellberg, the Swedish YouTuber with nearly 60 million subscribers, who has published videos filled with anti-Semitic and other racist outbursts on more than one occasion. Or the channel "Toy Freaks," which had over 8 million subscribers and featured explicit content targeted at young audiences, including videos of children vomiting and in extreme pain that it claimed were "pranks."

Granted, YouTube did act quickly in both cases: PewDiePie lost his original series Scare PewDiePie and his Google Preferred deal, much like Logan Paul, while the Toy Freaks channel was removed altogether. But those episodes should've been a huge red flag that the company needed to take a hard look at itself and change its video-review process to depend less on machine learning and more on humans -- just as it now plans to do.

Had the new system been in place, chances are the controversial Logan Paul video would never have been viewed by the masses, and YouTube could have spared itself a major public outcry. In fact, there's still an ongoing petition calling for his channel to be deleted, which so far has been signed by more than half a million people. The hope for YouTube now is that, by having humans monitor popular uploads, there will be less of a chance of offensive videos being published in the future. A YouTube spokesperson told Engadget that every decision the company makes has to work for advertisers, creators and users alike, which can be complicated because not every situation is black and white.

With the overhauled YouTube Partner Program, for example, some creators aren't happy with the new requirements because they don't think they'll be able to make money. But YouTube says that 99 percent of the channels affected were making less than $100 per year. Ultimately, the spokesperson said, all of the recent changes, to both the advertising and community guidelines, are designed to "move everyone forward," adding that YouTube doesn't want one person's bad judgment to affect the rest of the platform -- even though it certainly feels like it already has.

Paul Levinson, a professor of communications and media studies at Fordham University, said he has mixed feelings about the decisions YouTube is making. He believes that, by censoring its creators, the site will lose the freedom that's made it the most popular video site in the world. That said, Levinson also understands that it isn't appropriate to have a corpse or other "disgusting" content in a video.

"Of course, you could argue that if someone doesn't like it, they don't have to view it," he said. "You know, they can just shut it off the second they see it, but obviously, I get why people find that offensive, even repulsive. And so in that sense, it's a good thing, but at the same time I'm concerned that we're beginning to see the end of that totally open [internet]."

Now, as we move past Logan Paul's controversy, it'll be interesting to see how effective YouTube's new monitoring system turns out to be, and whether the company decides to expand it beyond the top five percent of creators. But don't be surprised if some videos manage to slip through the cracks, because, like the algorithms that have failed YouTube in the past, humans are far from perfect.

Images: Getty Images (All)