Vox has an explainer today about how YouTube automatically steers users toward deranged and conspiratorial videos, even if the users have started out with perfectly ordinary interests. This has been explained before, but it keeps needing to be explained because it’s so incomprehensible in normal human terms: YouTube’s algorithms, which are built to keep people watching as many videos and video ads as possible, have apparently followed that instruction to the conclusion, as Zeynep Tufekci wrote in the New York Times, “that people are drawn to content that is more extreme than what they started with—or to incendiary content in general.”
The humans who run YouTube (and run its algorithms) aren’t exactly proud of the fact that their product showcases misogynist rants or pseudoscientific nonsense or apocalyptic conspiracy theories. But their position is that what happens inside their black box is extremely hard to correct or regulate, and that on the scale at which YouTube operates, it’s impossible to apply human judgment to every case. They wish there were a way to serve up video recommendations without poisoning people’s minds till someone believes it’s necessary to invade a pizza parlor with an assault rifle, but that’s a real tough computational challenge.
What this line of defense leaves out is a very basic, obvious fact: YouTube already has access to an algorithm that can sort through videos without promoting unhinged fringe material. It’s called Google. YouTube is part of Google. When and if Google’s search algorithms start giving Google users fringe results, Google treats that as a failure and tries to fix the algorithms.
In the Vox piece, Jane Coaston writes about what happened when she searched “Trump” on a site that tracks YouTube’s video recommendations:
The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)
Here is what came up when I tried a search for “Trump” on the “Videos” tab of Google.com:

[Screenshot: Google video search results for “Trump”]
Searching YouTube’s recommendations for “Hillary Clinton” led Coaston straight to conspiracy theories, including ones about murder. Here’s “Hillary Clinton” on a Google video search:

[Screenshot: Google video search results for “Hillary Clinton”]
Somehow, where YouTube declares itself helpless before the enthusiasms of the public, Google is perfectly capable of serving up Hillary Clinton content without going off the deep end.
It’s true that Google and YouTube are different services, with different architecture. Google was built to index the Web and sort through existing material; YouTube hosts video content itself. That distinction, though, isn’t as big as it might seem. Google video search points toward videos on the websites of various news organizations, such as the Washington Post or AP News, while YouTube has to point to YouTube; but the Washington Post and AP News are also YouTube content providers. Pretty much everyone is.