Now, how did you end up there? Nobody really seems to know. YouTube's recommendation algorithm has been at the center of growing controversy over the last few years, as it seems to prefer nudging users toward as much weird and potentially dangerous content as possible: conspiracy theory videos, blatantly racist videos, anti-vaccine videos and even videos encouraging viewers to radicalize (all styles and genres are represented, from far-left radical communism through far-right extremism to good old jihadism).
Obviously, Google, which owns YouTube, won't tell, as the recommendation algorithm is one of its business secrets. But word out there is that even Google doesn't really know how it works anymore, as it is driven mostly by machine learning and some level of AI.
Mozilla, the foundation behind Firefox, now wants to understand how YouTube's recommendation algorithm really works.
In order to gather data on the behavior of YouTube's algorithm, Mozilla has launched a browser add-on called RegretsReporter, available for both Firefox and Chrome, that allows users to report, for example, "hey, I ended up watching this scary QAnon video when all I wanted to do was to watch some yoga videos".
RegretsReporter add-on
The extension then informs Mozilla about the video and how the user ended up watching it - that is, which videos triggered the algorithm to recommend that particular one. From that data, Mozilla hopes to better understand how the system actually works.
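To give a rough idea of what such a report could contain, here is a minimal sketch of how a browser extension might record the trail of watched videos and submit it when the user flags one. This is purely illustrative and not Mozilla's actual code; the endpoint URL, message name and field names are all made up for the example.

```typescript
// background.ts - hypothetical WebExtension background script (illustrative only)
const REPORT_ENDPOINT = "https://example.org/report"; // placeholder, not a real Mozilla URL

// Per-tab history of watched YouTube video IDs, newest last.
const watchTrail = new Map<number, string[]>();

function extractVideoId(url: string): string | null {
  try {
    const parsed = new URL(url);
    if (parsed.hostname.endsWith("youtube.com") && parsed.pathname === "/watch") {
      return parsed.searchParams.get("v");
    }
  } catch {
    // ignore malformed URLs
  }
  return null;
}

// Record every video the user navigates to, per tab.
chrome.tabs.onUpdated.addListener((tabId, changeInfo) => {
  if (!changeInfo.url) return;
  const videoId = extractVideoId(changeInfo.url);
  if (!videoId) return;
  const trail = watchTrail.get(tabId) ?? [];
  trail.push(videoId);
  watchTrail.set(tabId, trail);
});

// When the user clicks "report", send the regretted video plus the trail
// that led to it, so researchers can see what preceded the recommendation.
chrome.runtime.onMessage.addListener((message, sender) => {
  if (message.type !== "report-regret" || sender.tab?.id === undefined) return;
  const trail = watchTrail.get(sender.tab.id) ?? [];
  fetch(REPORT_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      regrettedVideo: trail[trail.length - 1] ?? null,
      precedingVideos: trail.slice(0, -1),
      reportedAt: new Date().toISOString(),
    }),
  });
});
```

The key point of the design is the "preceding videos" list: it is the watch trail, not just the single regretted video, that lets researchers study which recommendations led somewhere unwanted.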
According to the project website, Mozilla will publish the results of the study later and hopes they will serve as a starting point for a public debate on how recommendation algorithms should be altered in the future, in order to avoid the kind of situation we're now seeing with most social networks and their recommendation mechanisms.
You can download the extension for your browser here:
RegretsReporter add-on for Firefox
Written by: Petteri Pyyny @ 21 Sep 2020 4:57