Team TechTree
07:29 31st Jan, 2020
Are YouTube Videos Radicalizing Users?
Research presented at a global conference in Spain suggests that it may be true in the case of far-right ideologies
When sociologist Zeynep Tufekci called YouTube an engine of radicalization in a New York Times article in the spring of 2018, opinion was sharply divided: a few agreed with her, but many dismissed the claim as alarmist, and others thought it was simply too far-fetched to be true.
However, a research paper presented at a global conference on fairness, accountability and transparency in socio-technical systems suggests that YouTube does indeed play a role in radicalizing users. The paper is the result of research conducted at Switzerland's École Polytechnique Fédérale de Lausanne (EPFL) and Brazil's Federal University of Minas Gerais.
The paper, titled “Auditing Radicalization Pathways on YouTube”, was presented at the ACM FAT* 2020 conference in Barcelona. By analysing comments and view counts on YouTube, the researchers suggest that certain right-leaning YouTube communities have been acting as gateways to fringe far-right ideologies.
The researchers analysed 330,925 videos posted on 349 channels, which they classified into four groups — Media, the Alt-lite, the Intellectual Dark Web (IDW) and the Alt-right — along with a whopping 72 million comments. The data suggest a pipeline effect playing out over several years, in which users who started out commenting on Alt-lite and IDW content gradually shifted to commenting on far-right content.
Notably, this is exactly what Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, argued in her now-famous New York Times article. She noted that after watching a few videos of Donald Trump during the 2016 presidential race, YouTube began recommending autoplay videos “featuring supremacist rants, Holocaust denials and disturbing content.”
The researchers say that “a significant amount of commenting users systematically migrates from commenting exclusively on milder content to commenting on more extreme content” and suggest that this finding provides “significant evidence that there has been, and there continues to be, user radicalization on YouTube, and our analyses of the activity of these communities… is consistent with the theory that more extreme content ‘piggybacked’ on the surge in popularity of I.D.W. and Alt-lite content… We show that this migration phenomenon is not only consistent throughout the years, but also that it is significant in its absolute quantity.”
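To give a sense of what this kind of "migration" measurement involves, the sketch below is a simplified, hypothetical illustration only — it is not the authors' code, dataset or methodology. It tags each comment with the channel group it was posted under and flags users whose earliest comments appear only on the milder groups (Alt-lite, IDW) but who later comment on Alt-right videos.

```python
# Illustrative sketch only: a toy version of a comment-migration check,
# not the actual analysis from "Auditing Radicalization Pathways on YouTube".
from collections import defaultdict

# Hypothetical comment records: (user_id, year, channel_group)
comments = [
    ("u1", 2015, "Alt-lite"),
    ("u1", 2017, "Alt-right"),
    ("u2", 2015, "IDW"),
    ("u2", 2018, "IDW"),
    ("u3", 2016, "Media"),
]

MILD = {"Alt-lite", "IDW"}
EXTREME = {"Alt-right"}

# Group each user's comments chronologically.
history = defaultdict(list)
for user, year, group in sorted(comments, key=lambda c: c[1]):
    history[user].append((year, group))

# A user "migrates" if their earliest comments fall only in the milder
# groups and a later comment lands on an Alt-right channel.
migrated = []
for user, events in history.items():
    first_year = events[0][0]
    early_groups = {g for y, g in events if y == first_year}
    later_groups = {g for y, g in events if y > first_year}
    if early_groups <= MILD and later_groups & EXTREME:
        migrated.append(user)

print(f"{len(migrated)} of {len(history)} users migrated:", migrated)
```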
Of course, the researchers point out that the absence of data from Google on how it recommends autoplay videos, and on how personalization shapes results, prevented them from determining exactly how users migrate from moderate to extremist content on the platform.
“We do find evident traces of user radicalization, and I guess the question asks why is YouTube responsible for this? And I guess the answer would be because many of these communities they live on YouTube and they have a lot of their content on YouTube and that’s why YouTube is so deeply associated with it,” said Manoel Horta Ribeiro, the paper’s lead author.
While the latest research is indeed troubling, the fact remains that YouTube has tightened its approach towards far-right and extremist content over the past few years. In fact, the growing outcry over political ads, deepfakes, hate speech and online harassment has pushed the Google-owned company to tighten its norms further.
However, there still remains the question of how recommendation algorithms all but ensure that the contrarian viewpoint never surfaces on our radar. Tufekci experimented with this by creating a new account and watching Hillary Clinton videos in 2016, and found that the autoplay suggestions soon revolved around similarly left-leaning content, with the opposing viewpoint nowhere in sight.
The research from these academics now lends evidence to what many have long suspected about Google’s way of making money: keeping us watching, whatever it takes. The question is whether the company should take responsibility for driving us over the cliff.
TAGS: YouTube, Google, Radicalization, Right-wing, Left-wing, Speeches