Homa Hosseinmardi is a postdoctoral research associate at the University of Pennsylvania's Computational Social Science Lab, led by Duncan Watts. Her research centers on holistic, large-scale studies of the information ecosystem, with the ultimate goal of developing models that disentangle the roles of human and platform components in the production and consumption of problematic content. She is the lead researcher of the Penn Media Accountability Project (PennMAP) and co-founder of the CyberSafety workshop series, co-located with conferences such as CIKM and the Web Conference. Dr. Hosseinmardi received an outstanding research award during her PhD, and her research has been published in more than twenty-five peer-reviewed outlets—including computer science conferences and top journals such as PNAS, IMWUT, and TKDD—and featured in the press.

Talk: Examining algorithmic bias and radicalization on YouTube

Attend this talk virtually via Zoom // passcode 800459

Abstract: With the ever-increasing prevalence of digital technology, imagining the counterfactual of a world without it becomes more difficult every day. Online and social media platforms have changed every aspect of our daily lives: the ways we receive information, educate ourselves, and socialize. Yet, as with any other technology, these advancements in cyberspace have been accompanied by complications. Recently, the media have portrayed anecdotes of YouTube users who begin with benign content, such as food videos, and fall into rabbit holes of extreme political content, framing this as a feature inherent to the platform.

In this talk, I will present my recent work, in which we tested this hypothesis using a representative panel of more than 300,000 Americans and their individual-level browsing behavior, on and off YouTube, over a four-year period. We showed that the pathways by which users reach far-right videos are diverse, and only a fraction can plausibly be attributed to platform recommendations. Although our results do not rule out the possibility that recommendations drive engagement for the heaviest consumers, they are strongly consistent with the explanation that consumption of radical content on YouTube is, at least in part, a reflection of user preferences. I conclude by discussing how trends in video-based political news consumption are determined by a complicated combination of user preferences, platform features, and the supply-and-demand dynamics of the broader web, rather than simply the policies and algorithmic properties of a single platform.