This article was published on July 23, 2019

[Best of 2019] ‘YouTubers’ author Chris Stokel-Walker on how to fix the world’s biggest video platform

Over the past couple of years, YouTube’s been in hot water over a myriad of issues that would give any CEO sleepless nights. The list of offenses includes failing to police problematic videos appearing on its platform, toxic comments that have pushed it to disable the feature for some content, and troubling recommendations delivered to unwitting users, including children.

Naturally, the company has been lambasted over these shortcomings – rightly so, many people would argue. I’ve been on that side of the debate too, having written about the platform’s mistakes over the past few years.

That said, I began to wonder if we weren’t collectively being too harsh on a platform that does something genuinely remarkable: it lets more than two billion people around the world watch video, on demand, that’s uploaded at a rate of 500 hours’ worth of content per minute.

To get to the bottom of this, I spoke to Chris Stokel-Walker, who’s literally written the book on the subject. YouTubers, which hit shelves in May, looks at the lives of prominent content creators who make a living on the platform, as well as its impact on society at large. Stokel-Walker decided to write it after covering YouTube for years as a freelance reporter, and coming up empty in his search for an authoritative source of information on the site’s 14-year history.

Stokel-Walker had a lot to say about the platform, and how it’s viewed in the media. In a conversation that began on Twitter and then expanded into a full-blown interview, I started with a simple question: how realistic is it to expect that a company can consistently and efficiently enforce its content guidelines for all its users?

“Obviously they’re not going to be perfect ever,” Stokel-Walker said. “That’s part of the challenge. But The Guardian’s UK technology editor, Alex Hern, made a really good point some time ago. He basically asked, why can’t YouTube actually just hire more human staff?”

In now-deleted tweets (he does that often), Hern did a back-of-the-napkin calculation: it would take roughly 100,000 people working eight-hour shifts to moderate the 400 hours of YouTube content being uploaded every minute. At London’s living wage of about £20,000 ($25,000) a year, that works out to roughly £2 billion ($2.5 billion) a year, which isn’t terrible considering that YouTube’s parent company, Alphabet, made profits of about $30 billion last year.
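
For the curious, here’s a minimal sketch of that back-of-the-napkin arithmetic in Python. The 400 hours per minute, eight-hour shifts, £20,000 wage, and ~100,000 headcount come from Hern’s estimate as described above; the assumption that one moderator reviews footage in real time for a full shift is an illustrative simplification, not anything YouTube has confirmed.

```python
# Rough reconstruction of Hern's back-of-the-napkin moderation estimate.
# Figures (400 hours/minute, 8-hour shifts, £20,000/year, ~100,000 heads)
# come from the article; real-time review for a full shift is an
# illustrative simplification.

UPLOAD_HOURS_PER_MINUTE = 400
SHIFT_HOURS = 8
ANNUAL_WAGE_GBP = 20_000
HEADCOUNT = 100_000  # rounded up from the raw figure to allow for breaks, days off, overlap

hours_uploaded_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24          # 576,000 hours/day
moderators_per_day = hours_uploaded_per_day / SHIFT_HOURS           # 72,000 people
annual_cost_gbp = HEADCOUNT * ANNUAL_WAGE_GBP                       # £2,000,000,000

print(f"Content uploaded per day: {hours_uploaded_per_day:,} hours")
print(f"Moderators needed per day (real-time review): {moderators_per_day:,.0f}")
print(f"Estimated annual wage bill: £{annual_cost_gbp / 1e9:.1f} billion")
```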

“You know, they talk about the fact that they have 10,000 human moderators, and they have an automated monitoring system that represents the collective brain power of another 180,000 people, which is fair enough. But the ultimate fact is that we are seeing a shift in terms of governments’ response to this, where they want to see platforms become far more interventionist and proactive,” said Stokel-Walker.

The author noted that he’s still a tech journalist, still believes there is good in tech, and thinks interventionism can be troublesome. “But YouTube gives a platform to a lot of extremists who could radicalize others, and enables them to reach many people. So there does need to be, I think, some responsibility from the platforms to kind of be aware of their power and the impact they can have on people’s mindsets,” he said.

Chris Stokel-Walker, author of ‘YouTubers’

Stokel-Walker often points out that nobody sets up a video-sharing website in 2005 to upload footage of elephants at the zoo thinking it will go on to become a platform seen by most of the world’s internet-connected population. But at some point along the way, he says, YouTube must have recognized the power that it has over society.

“The reason that they have to recognize that is that journalists have recognized that, and journalists don’t have access to the reams of data that YouTube and YouTube’s engineers do to try and drive engagement and keep them on the site. So if we as viewers can see the issues coming for this platform, warn YouTube about it, and ask the company to do something about it, I don’t see why YouTube itself can’t,” says Stokel-Walker.

So where does the problem actually lie? And is this ‘problem’ part of the way YouTube is built, or is it simply difficult to solve? A big part of the platform is its algorithm for recommending content, which is tuned to increase viewer engagement on the site.

Stokel-Walker said that the algorithm design is “a black box up to a certain extent.” There’s only partial information on how it works, from academic papers that YouTube has put out. Stokel-Walker explained that it’s designed to prioritize watch time:

“Its goal is to try and extend the amount of time that you spend watching videos on the platform,” he said. “Basically, it looks at things like your search history, where you live, your gender, what kind of device you’re watching on, and then it kind of narrows down its recommendation engine based on what it thinks will keep you watching.”
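
To make that idea concrete, here’s a toy sketch of a watch-time-driven recommender in Python. YouTube’s actual system is, as Stokel-Walker says, largely a black box; the features, weights, and function names below are hypothetical, and the sketch only illustrates the general principle of ranking candidate videos by predicted watch time.

```python
from dataclasses import dataclass
from typing import List

# Toy illustration only: rank candidate videos by how long we *predict*
# a user will keep watching. None of these features or weights are
# YouTube's; they are made up to show the shape of the idea.

@dataclass
class Video:
    title: str
    topic: str
    avg_watch_minutes: float  # historical average watch time for this video

@dataclass
class User:
    recent_search_topics: List[str]
    device: str  # e.g. "mobile", "tv"

def predicted_watch_minutes(user: User, video: Video) -> float:
    """Hypothetical scoring: boost topics the user has searched for,
    and longer-form content on living-room devices."""
    score = video.avg_watch_minutes
    if video.topic in user.recent_search_topics:
        score *= 1.5  # assumed affinity boost
    if user.device == "tv":
        score *= 1.2  # assumed longer sessions on TVs
    return score

def recommend(user: User, candidates: List[Video], k: int = 3) -> List[Video]:
    """Return the k candidates with the highest predicted watch time."""
    return sorted(candidates, key=lambda v: predicted_watch_minutes(user, v), reverse=True)[:k]
```

Whatever maximizes that predicted score is what gets surfaced.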

And that’s just one part of the story. Some creators are, in a way, giving in to the algorithm with their content too. “It’s kind of a symbiotic relationship,” Stokel-Walker explained. “Humans like watching interesting things. So if you want to try and increase watch time, you need to grab viewers’ attention. And if you want to grab viewers’ attention, you kind of need to do more extreme things. Whether that’s extreme things in terms of doing a stunt, something like climbing to the top of a skyscraper and recording yourself doing it, or whether it’s putting forward conspiracy theories that will grab viewers’ attention.”

“So I guess it’s kind of human nature in a way,” he said. “The algorithm is, in many ways, an attempt to try and hold a mirror up to society. But what it does is that it sort of reflects back and amplifies some of the worst bits of us.”

That’s similar to how just about any of us will slow down on the road to get a better look at a car crash, even though it’s not exactly a pleasant sight.

“Now the interesting thing from talking to Guillaume Chaslot, who was an engineer on the algorithm, is that the algorithm and the engineers behind it treat it in a different way potentially than many viewers do,” said Stokel-Walker.

Credit: DisinfoLab
Guillaume Chaslot speaking at the DisinfoLab Conference in Brussels

He noted that YouTube only hires super smart people – MIT and Stanford graduates and such – and those are the people creating the algorithm. So if the algorithm promotes, say, extreme conspiracy theory videos, which Chaslot’s work has shown it does, “then the engineers who see that are inclined to believe these clips are being promoted because people are watching them as entertainment. They’re inclined to believe that people will see these videos as patently stupid, and they’ll get a laugh out of it,” explained Stokel-Walker.

“But the risk is that not everybody that uses YouTube has the media literacy to differentiate between real news and fake news. So I think there is an underestimation of the algorithm’s impact, with a simultaneous overestimation of the media literacy and sometimes the intelligence of viewers,” he noted.

On the surface, that sounds like a problem that smart people could solve if they put their heads together – but it’s probably not quite so simple at the scale YouTube operates, with so much content, so many independent creators, and billions of viewers.

Stokel-Walker argued that this isn’t solely due to the difficulty of solving these issues, or only the result of YouTube busying itself too much with efforts to increase engagement:

“I think that YouTube probably does try to get ahead of these issues. But also I think that it is inherently a reactive rather than a proactive company. And it really only answers to advertisers. You ask creators in the space about whether or not their concerns about various things are heard, and they’re often very unhappy. They say that advertisers take precedence. And you ask the business side the same question, and they say, well, unless we have big-money advertisers behind us, we’re not often listened to.”

Stokel-Walker said this is essentially how every YouTube scandal has played out over the past two years. “YouTube is very, very reluctant to take significant action up until the point that an advertiser says, ‘I’m pulling my ads from your platform.’ And then things happen overnight.”

He also pointed to a New York Times story from April, in which YouTube CEO Susan Wojcicki practically admits that they knew about a problem beforehand, but didn’t actually do anything about it until an advertiser kicked up a fuss.

Credit: 3282700 / Pixabay
Disney pulled its ads from YouTube in February over controversy around videos showcasing child exploitation

So is there a way to fix what’s broken at YouTube? With so many moving parts, it’s certainly not going to be easy. I agreed with Stokel-Walker’s view that YouTube constantly seems to run around in a manic state, trying to put out fires that appear all over the place and just tackling the issue in front of it on that given day.

The way out, he said, likely lies in taking a look at the bigger picture and putting some effort into tackling its systemic issues before its reputation is tarnished beyond repair:

“There isn’t an exact playbook for how you make online video work without issues and without advertiser scandals and things like that. However, there are potential models that can be looked at as useful prior examples. And one of them is a playbook that they are kind of already following, which – bizarrely, for something so disruptive to traditional media – is the traditional media playbook.”

“So the reason why YouTube announced slates of new shows last year, starring the likes of Will Smith and Kevin Hart, is because those people have been pre-vetted and because they are safe harbors for YouTube. It’s partly the reason, I think, why Will Smith was front and center in the 2018 Rewind video – because he is the acceptable face of YouTube. They need to gravitate more towards the old media way of doing things in order to try and get themselves through these scandals,” he said.

Stokel-Walker continued, “In the past few years, we’ve gone from one-man bands – single people producing content – to more formalized structures and companies essentially churning out content in the same way that a Hollywood TV studio would in the past.”

“Now that doesn’t always work. Case in point: people like Logan Paul who, in many ways, kickstarted a lot of the big headlines about YouTube with his video in the Japanese forest over a year ago. That was because he didn’t have oversight, essentially. He didn’t have people saying, ‘this is a bad idea.’ He didn’t have a higher-up, but a lot of the biggest channels on YouTube are no longer people like Logan Paul, they are no longer individual creators who have struck lucky. These are businesses set up to capitalize on the profits that can be wrung from this new media space.”

He concluded, “Bizarrely, the one thing that is kind of robbing YouTube of its soul – the idea of broadcasting yourself – which was the site’s motto for over seven of the 14 years that it’s been in existence, is actually the thing that might help it in the long run. That’s because going with vetted creators and content makes it look more like traditional media and therefore it makes it a safer bet for advertisers, and for viewers as well.”

Stokel-Walker’s book, YouTubers, is out now from Canbury Press.


