
How Automated YouTube Recommendations Foster Polarization

A screenshot from the YouTube Audit demo site shows the difference between recommended videos for left-leaning, center and right-leaning users. Photo courtesy of Muhammad Haroon.

A new study from UC Davis suggests that recommendation algorithms on sites like YouTube and TikTok can play a role in political radicalization. If the algorithm sees that a user is watching a lot of biased political videos, it can trap them in what the researchers call a "loop effect," where the system will continue recommending similarly biased and potentially more extreme content on their homepage and sidebar. Left unchecked, this can lead to polarization and radicalization for both right-wing and left-wing users.

"The system is doing what it's supposed to be doing, but it has issues that are external to the system," said Computer Science Ph.D. student Muhammad Haroon, who led the study. "Unless you willingly choose to break out of that loop, all the recommendations on that system will be zeroing in on that one particular niche interest that they've identified. This can lead to partisanship and increases the divide that is facing American society."

To quantify bias, the team assigned each video a score from -1 (far left) to +1 (far right) based on the ratio of left- to right-wing accounts that shared the video on Twitter. For example, if users who followed Alexandria Ocasio-Cortez shared a video more often than users who followed Ted Cruz, the video was given a more negative score. Their initial tests validated the approach: videos from far-right channels like Breitbart routinely scored close to +1, while videos from far-left channels scored close to -1.
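As a rough illustration of that kind of sharing-based score (a minimal sketch, not the team's actual formula; the function name and share counts are hypothetical), the computation could look like this:

```python
def slant_score(left_shares: int, right_shares: int) -> float:
    """Map a video's Twitter shares to a slant score in [-1, +1].

    left_shares:  shares by accounts following left-leaning politicians
    right_shares: shares by accounts following right-leaning politicians
    Returns -1.0 for purely left-shared videos, +1.0 for purely right-shared.
    """
    total = left_shares + right_shares
    if total == 0:
        return 0.0  # no political shares observed; treat as neutral
    return (right_shares - left_shares) / total


# Example: a video shared mostly by right-leaning followers scores near +1.
print(slant_score(left_shares=20, right_shares=180))   # 0.8
print(slant_score(left_shares=950, right_shares=50))   # -0.9
```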

Next, the team trained sock puppets, artificial entities that act like users while being controlled by the researchers. Each sock puppet was given a series of right- or left-leaning videos to watch every day, and then the team would compare the recommendations on the sock puppet's homepage to see if its recommended videos gradually became more biased.
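The structure of such an experiment might look roughly like the sketch below. This is not the team's actual code; the helpers for driving a browser (`watch_video`, `get_homepage`) and the slant score are stand-ins:

```python
import statistics

def run_sock_puppet(training_videos, days, watch_video, get_homepage, score):
    """Simulate one sock puppet: watch its assigned slate of videos each
    day, then record the average slant of its homepage recommendations.

    training_videos[day] is the list of left- or right-leaning videos
    assigned for that day; watch_video, get_homepage and score stand in
    for browser automation and a sharing-based slant score.
    """
    daily_bias = []
    for day in range(days):
        for video in training_videos[day]:
            watch_video(video)               # build up the watch history
        homepage = get_homepage()            # today's recommended videos
        daily_bias.append(statistics.mean(score(v) for v in homepage))
    return daily_bias  # a drift toward +1 or -1 suggests a "loop effect"
```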

"Some notion of radicalization was achieved in these sets of results," said Haroon. "We found that it's worse for the right wing, but it's a left-wing issue as well."

Algorithm a moving target

Initially, radicalization seemed to be twice as bad for right-wing users, but Haroon says that over time, and after a few key channels were suspended or banned, the problem became roughly equivalent on both sides. He also noted that what's considered far left (progressivism) and far right (largely conspiracy theories) aren't equal in extremity, and that the algorithm is always changing as new videos are uploaded and deleted every day.

That's why the team has kept a few sock puppets running over the past several months. They've shared their results on the YouTube Audit demo site, where people can compare the recommended homepage videos of left-, center- and right-leaning users on any given day. Haroon is especially interested to see if and how the recommendations change during major political events like elections.

"This algorithm is a moving target," he said. "It's always going to evolve, so hopefully, going forward, we will have this whole snapshot of YouTube's entire algorithm evolution over time."

They are also working on solutions to prevent radicalization. One idea is to develop a system that monitors the bias on a user's homepage and systematically "inoculates" it with unbiased videos. They're also looking into how non-political content like cooking videos or sermons can still have an inherent political bias and play a role in polarization.
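In principle, such an inoculation step could be as simple as the following sketch, where the threshold, the pool of near-neutral videos, and the helper names are all assumptions for illustration rather than the researchers' design:

```python
def inoculate(homepage, neutral_pool, score, threshold=0.5, n_inject=3):
    """If the homepage's average slant exceeds a threshold in either
    direction, swap the most extreme recommendations for videos drawn
    from a pool of near-neutral (score close to 0) content."""
    avg = sum(score(v) for v in homepage) / len(homepage)
    if abs(avg) < threshold:
        return homepage  # feed is already reasonably balanced
    # Rank videos by how extreme they are and replace the worst offenders.
    ranked = sorted(homepage, key=lambda v: abs(score(v)), reverse=True)
    return neutral_pool[:n_inject] + ranked[n_inject:]
```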

"These systems are all opaque to the end user, who doesn't really know what data they're feeding off," he said. "One of our larger themes is making them more transparent to give users a bit more control over the algorithm and what they're being recommended."

Haroon acknowledges that inoculation may make recommendations less relevant, which might not be in companies' best interests. However, he argues that it would mainly hurt political extremist content creators and disincentivize them from using the platform, which would be a net positive for the world.

"This is a problem that's bigger than YouTube," he said. "It's a problem of how online systems operate and then zeroing in on how a problem persists."

The study is part of a collaboration between Haroon's PI, Associate Professor , CS Professors  and , and Professor  in the Department of Communication that looks at "good AI vs. bad AI": the ethics and fairness issues associated with online AI systems. Though he comes from a purely technical background, Haroon was drawn to the project by the chance to tackle social issues as well.

"I was interested in how I can use my knowledge and skills as a computer scientist to make an impact outside the computer science community," he said.

Media Resources

Noah Pflueger-Peters writes about research at the UC Davis College of Engineering.
