Una Mullally: How YouTube can suck you into an extremist swamp

The recommendation algorithm can lead online viewers up the radicalisation pathway.

YouTube: increasingly the ideas we are consuming are not chosen by us but for us, and often by algorithms that know more about what we will find compelling than our own minds do.

In her recent quarterly letter, Susan Wojcicki, the chief executive of YouTube, wrote: “A commitment to openness is not easy. It sometimes means leaving up content that is outside the mainstream, controversial or even offensive. But I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views.”

These pronouncements from YouTube often seem to shirk the consequences of the content. It’s not just about the rights and wrongs of what material is on the platform; it’s also about what happens when that content is consumed. When Wojcicki talks about a “broad range of perspectives”, it is as if people browsing videos on YouTube are encountering a wide range of differing opinions from every political persuasion, rather than a wide range of opinions within a particular political world view. YouTube is the filter bubble, weaponised.

A paper published last month by researchers at universities in Brazil, Switzerland and the US, titled Auditing Radicalisation Pathways on YouTube, offers a compelling insight into the idea that people exploring the thin end of the wedge of certain content can be nudged towards more extreme material. “Non-profits and the media claim there is a radicalisation pipeline on YouTube,” the study’s authors wrote. “Its content creators would sponsor fringe ideas, and its recommender system would steer users towards edgier content. Yet, the supporting evidence for this claim is mostly anecdotal, and there are no proper measurements of the influence of YouTube’s recommender system. In this work, we conduct a large-scale audit of user radicalisation on YouTube.” The study analysed 331,849 videos on 360 channels and processed 79 million comments, with an additional focus on YouTube’s recommendation algorithm, examining two million recommendations between May and July 2019.

via The Irish Times: Una Mullally: How YouTube can suck you into an extremist swamp
