This week on The Vergecast, we’re doing something a little different — we’re introducing you to our newest reporter, Julia Alexander, who joins The Verge to cover YouTube, Twitch, and more.
Nilay sat down with Julia to talk about the controversy around YouTube’s recommendation algorithm, the rise of parasocial relationships, and how creators are cultivating an audience of very savvy media consumers.
Below is a brief, edited transcript of their conversation about how YouTube’s recommendation algorithm is radicalizing young people on the far right:
Nilay Patel: What’s broken with the platform?
Julia Alexander: So the number one issue, I think, is the recommendation algorithm. It’s radicalizing so many people. I spoke to a lot of kids, for example, who came up through Gamergate — they were like 13, 14, when Gamergate first happened in 2014 — who are now 18, 19, and they were saying YouTube is the main reason they believe a lot of the stuff they believe. ‘Cause they would watch a video from someone like Sargon of Akkad, and that would give them recommendations into this whole area of people — it’s disturbing, and it’s just opinions. And I’ve talked to high school teachers a lot, and the kids they teach just use YouTube for news, and they’re getting really bad sources to back up their opinions.
So I think that’s the most disturbing part of YouTube, and it’s not something YouTube knows how to fix. I mean, they’re interested in fixing it; it’s just not something they’re capable of fixing, and I think that’s a Google problem, not just a YouTube problem.
Nilay: Why do you think they’re not capable of fixing it? ’Cause I mean, they are trying some things, right? They’re labeling things in different ways. I think the Wikipedia links they’re now adding are adorable in their way.
Julia: Yeah. The moon landing happened.
Nilay: Yeah. I remember — I won’t name names — CES one year, end of the day, long day, everyone’s having a drink, and I just remember suddenly the conversation became about whether 9/11 was an inside job. I was like, I was an adult when that happened. That was 100 percent real. I promise you, every journalist in America would be chasing that story forever if it hadn’t been. But he’s like, “Well, I watch a bunch of YouTube videos.” And I just remember thinking to myself, “Wow, this is, like, a lot.”
Julia: There was a moment with Kyrie Irving — he was on stage somewhere, and they brought up the fact that he believed the Earth was flat, and he very explicitly said, “I was watching a bunch of YouTube videos and I got into the hole.” On YouTube it’s super easy to find something, and it starts off really fun, with like, “Is the moon landing real?” I’m gonna watch a conspiracy video. But that quickly leads to topics that aren’t as ludicrous and are even scarier.
So you use keywords like “liberals” or “feminism” or “conservatives” and it gets into scary territory, where people spend a day formulating a very well-put-together 20-minute argument that is made in bad faith. And it travels well: it spreads to Twitter, and it spreads to Instagram, and it’s like this cross-promotional thing. And I don’t think YouTube knows how to stop people from gaming its system, which is upsetting.
And that’s the conversation I have with creators a lot, where they outthink YouTube all the time. Abusing tags, abusing metadata — it’s something websites are aware of, and YouTube doesn’t know how to fix it. The most blatant issue it’s facing is that a lot of creators just put in however many tags they want. You could put “Google” or “Android” into a YouTube search and you’re going to get far-right conspiracy theories, because people realize that you’re searching for Android and they can game that pretty easily.