YouTube’s recommendation algorithm discourages radicalism, researchers say, disputing reports
Susan Wojcicki, chief executive officer of YouTube Inc., introduces the company's new television subscription service.
Patrick T. Fallon | Bloomberg | Getty Images
Examining millions of YouTube recommendations over the course of a year, two researchers have determined that the platform in fact combats political radicalization.
The researchers said that, as of late 2019, YouTube's recommendation algorithm appears to be designed to benefit mainstream media and cable news content over independent YouTube creators. The study, which was published on Tuesday, also says that YouTube's algorithm favors left-leaning and politically neutral channels.
Independent data scientist Mark Ledwich and UC Berkeley postdoctoral researcher Anna Zaitsev conducted the study, which concluded that while radical content exists on the platform, the recommendation algorithm does not currently direct users to such videos.
“There is clearly plenty of content on YouTube that one might view as radicalizing or inflammatory. However, the responsibility of that content is with the content creator and the consumers themselves,” they said in the study. “Shifting the responsibility for radicalization from users and content creators to YouTube is not supported by our data.”
The study comes after a series of New York Times articles published earlier this year about radicalization on Google-owned YouTube. In one of the stories, 26-year-old Caleb Cain recounts how he fell into what he described as an “alt-right rabbit hole” years ago.
Since Cain's experience, YouTube has changed how it recommends content.
YouTube, like all social media platforms, has been grappling with the matter of content moderation in recent years.
“There will always be content on YouTube that brushes up against our policies, but doesn't quite cross the line,” the company wrote in a blog post earlier this month. “So over the past couple of years, we've been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation.”
Ledwich on Friday published an essay on Medium to explain the results of his research and criticize news coverage of YouTube's recommendation algorithm.
“Contrary to the narrative promoted by the New York Times, the data suggests that YouTube's recommendation algorithm actively discourages viewers from visiting content that one could categorize as radicalizing or otherwise questionable,” he said in the post.