Youtube and alternative facts
This blog post is about another blog post, by Guillaume Chaslot (Blogception). Chaslot is a programmer who formerly worked at both Microsoft and Google and now works for Bayes Impact, an NGO that creates open-source software and algorithms for the public good. He recently wrote a recommendation explorer which extracts the most frequently recommended videos from Youtube for a given query. He also compared the first twenty recommended videos with the results of Google and Youtube searches for the same query.
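Chaslot's actual tool scrapes Youtube's "up next" suggestions; the sketch below only illustrates the core idea, namely starting from the results of a search query, following recommendations a few hops deep, and counting how often each video is recommended along the way. The get_recommendations function and the toy data here are hypothetical stand-ins for the real scraping code, not Chaslot's implementation.

```python
from collections import Counter, deque

# Hypothetical toy recommendation graph standing in for live scraping.
TOY_GRAPH = {
    "climate_lecture": ["hoax_video_a", "hoax_video_b", "serious_doc"],
    "serious_doc": ["hoax_video_a", "climate_lecture"],
    "hoax_video_a": ["hoax_video_b", "hoax_video_c"],
    "hoax_video_b": ["hoax_video_a", "hoax_video_c"],
    "hoax_video_c": ["hoax_video_a", "hoax_video_b"],
}

def get_recommendations(video_id):
    """Stand-in for fetching a video's 'up next' recommendations."""
    return TOY_GRAPH.get(video_id, [])

def crawl(seed_videos, depth=3):
    """Follow recommendations up to `depth` hops from the seeds and
    count how often each video is recommended along the way."""
    counts = Counter()
    seen = set(seed_videos)
    frontier = deque((video, 0) for video in seed_videos)
    while frontier:
        video, hops = frontier.popleft()
        if hops == depth:
            continue
        for rec in get_recommendations(video):
            counts[rec] += 1          # every recommendation is tallied,
            if rec not in seen:       # but each video is expanded only once
                seen.add(rec)
                frontier.append((rec, hops + 1))
    return counts

if __name__ == "__main__":
    # In the real tool the seeds would come from a Youtube search,
    # e.g. for "is climate change real".
    for video, n in crawl(["climate_lecture"]).most_common():
        print(f"{video}: recommended {n} times")
```

Even in this tiny invented graph, the most frequently recommended videos are the ones that other videos link back to most often, which is exactly the kind of aggregate picture Chaslot's explorer extracts at scale.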
The results are quite shocking: it turns out that Youtube heavily recommends videos that propagate all sorts of conspiracy theories and alternative facts. If you watch a video on climate change, 70% of the subsequently recommended videos claim that climate change is a hoax, whereas if you ask Google whether global warming is real, only 25% of the first twenty results claim it is not (still way too many, in my opinion). Youtube’s recommendations are even more egregious in other cases: if you ask whether the earth is flat or round, for instance, 90% of the recommended videos propagate ‘flat earth theory’.
The reason why Youtube recommends so many ‘alternative facts’ is that its recommendation algorithm aims to maximize a user’s “watch time”, and conspiratorial videos apparently do much better at holding viewers’ attention than proper information does. Perhaps that is not surprising, but it means that someone searching for proper information is liable to be recommended a lot of nonsense after watching just a single serious video.
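A toy model makes the incentive concrete: if a ranker scores candidate videos purely by predicted watch time, and conspiratorial videos happen to keep people watching longer, they float to the top regardless of accuracy. The titles and numbers below are invented for illustration; this is not Youtube's actual ranking code.

```python
# Toy model of a watch-time-optimizing ranker. Scores are invented;
# the point is only that accuracy plays no role in the objective.
candidates = [
    {"title": "IPCC report explained", "predicted_watch_minutes": 4.2},
    {"title": "What scientists WON'T tell you", "predicted_watch_minutes": 11.7},
    {"title": "Climate data walkthrough", "predicted_watch_minutes": 3.1},
    {"title": "The hoax EXPOSED", "predicted_watch_minutes": 9.8},
]

# Rank purely by expected watch time, the objective the algorithm is
# said to optimize; truthfulness never enters the sort key.
ranked = sorted(candidates,
                key=lambda v: v["predicted_watch_minutes"],
                reverse=True)
for rank, video in enumerate(ranked, start=1):
    print(f"{rank}. {video['title']} ({video['predicted_watch_minutes']} min)")
```

Under this objective the sensational titles win every time, which is the structural problem the rest of this post is about.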
Chaslot is very cautious and writes that “The point here is not to pass judgement on Youtube. They’re not doing this on purpose, it’s an unintended consequence of the algorithm.” But I think we might have to be less charitable. Even if the consequence of your action is unintended, you can still be responsible for it if the consequence was a foreseeable side-effect. When I throw a party until 4 a.m. on a weekday, I surely do not intend to wake my neighbours, but I can still be held responsible for it, and I should have known better, even though my intention was just to have a good time with friends. By now, Youtube should know that their algorithm propagates conspiracy theories, even if its original aim was merely to increase watch time. And therefore they are responsible for spreading nonsense.
But to what extent does a private company have public responsibilities? This is a difficult question to answer in general. However, Youtube should be aware that they do not just list the conspiratorial videos; they actively recommend them. And, as many social epistemologists would tell you, recommendation also provides assurance. If I recommend a plumber to you, then I assure you that he will do good work (or at least that he has done good work on my plumbing). Hence, when Youtube recommends a video that explains how the Clintons ran a paedophile ring out of a pizza parlour in New York, what exactly are they doing? Whatever it is, it does not seem to be something that a company that claims to want to “provide a forum for people to connect, inform, and inspire others” should be doing.