The YouTube recommendation algorithm more often promotes videos with a conservative tilt, regardless of users' political beliefs, The Register reports.
Researchers from the Center for Social Media and Politics at New York University asked 1,063 adult Americans to install a browser extension that tracked their viewing activity.
As a starting point, the team selected 25 videos of political and non-political content. Participants were asked to pick one and then follow YouTube's subsequent recommendations.
After watching each video, the participant had to choose one of five recommended options. For each participant, the team randomly assigned a fixed rule for which of the options to pick.
The study was conducted from October to December 2020. Each participant viewed up to 20 recommended videos on YouTube per day.
The extension recorded which videos the service recommended at each step. The team assessed the ideological viewpoint of each video to measure the impact of echo chambers and any hidden biases in the system.
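The traversal procedure described above can be illustrated with a toy simulation. This is a hypothetical model, not the researchers' actual code or data: video ideology is reduced to a single score in [-1, 1] (negative = left, positive = right), recommendations are assumed to carry a small per-step bias (`drift`), and the participant follows a fixed pick rule, as in the study.

```python
import random

def recommend(current_score, drift=0.02, noise=0.3, n=5):
    """Generate n recommended-video ideology scores near the current video's score.

    `drift` is an assumed per-step bias for illustration only.
    """
    recs = []
    for _ in range(n):
        s = current_score + drift + random.uniform(-noise, noise)
        recs.append(max(-1.0, min(1.0, s)))  # clamp scores to [-1, 1]
    return recs

def traverse(start_score, steps=20, pick_index=0):
    """Follow recommendations for `steps` videos using a fixed pick rule,
    recording the ideology score of each video along the way."""
    trail = [start_score]
    current = start_score
    for _ in range(steps):
        current = recommend(current)[pick_index]
        trail.append(current)
    return trail

random.seed(0)
# Average final position over many simulated participants starting neutral:
# with a positive drift, the mean ends up slightly right of the start.
finals = [traverse(0.0)[-1] for _ in range(1000)]
print(f"mean ideology after 20 steps: {sum(finals) / len(finals):+.3f}")
```

Averaging over many simulated trails is what lets a systematic nudge be separated from the random noise of individual recommendation chains.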
“We found that the YouTube algorithm does not lead the vast majority of users into extremist rabbit holes, even as it pushes users into increasingly narrow ideological ranges of content,” the study says.
They found that, on average, the recommendation system nudges viewers a bit to the right on the political spectrum, regardless of their ideological beliefs.
“We consider this a new finding,” the researchers said.
The team also found that the system nudges users toward more right- or left-leaning media depending on their starting point: as the chain of recommendations progresses, the ideological distance from that starting point increases, the researchers noted.
For example, a user who starts with moderately liberal material will see recommendations drift further to the left over time, but only slightly and gradually.
A similar effect had previously been observed on Twitter. The platform’s recommendation algorithm tends to promote posts from right-wing politicians and news outlets more than from left-wing ones.
Political scientists from New York University suggested that this may be related to the provocative nature of conservative content, which drives greater engagement.
As reported in September, Mozilla experts found that user interactions with YouTube do not strongly influence the behavior of the recommendation algorithms.