Researchers reduce polarization on X by adjusting algorithm

  • Last update: 11/30/2025

Researchers at Stanford University have unveiled a new tool designed to decrease political hostility on X feeds by rearranging posts instead of removing them. The findings, published in the journal Science, indicate a future where users could manage their own social media algorithms across multiple platforms.

After Elon Musk acquired Twitter in 2022, he lifted numerous restrictions that had aimed to protect users from hate speech and misinformation. This shift amplified the visibility of posts from users with right-leaning views.

The Stanford team, working independently from X, developed a browser extension that reordered participants' feeds. An AI language model analyzed each feed in real time, and content exhibiting partisan hostility or anti-democratic sentiment, including posts promoting violence against or the imprisonment of political opponents, was moved lower. Importantly, unlike a traditional ad blocker, the tool deleted or blocked nothing.
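The core idea, scoring posts for hostility and demoting rather than deleting them, can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the researchers used an AI language model to score posts, which is stood in for here by a trivial keyword heuristic (`HOSTILE_MARKERS`, `hostility_score`, and `rerank` are all hypothetical names).

```python
from dataclasses import dataclass

# Hypothetical stand-in for the study's AI language model, which
# scored posts for partisan hostility in real time.
HOSTILE_MARKERS = {"imprison", "traitor", "destroy"}

@dataclass
class Post:
    id: int
    text: str

def hostility_score(post: Post) -> int:
    """Crude proxy: count hostile marker words in the post text."""
    words = post.text.lower().split()
    return sum(w.strip(".,!?") in HOSTILE_MARKERS for w in words)

def rerank(feed: list[Post]) -> list[Post]:
    """Demote (never remove) posts flagged as hostile.

    Python's sort is stable, so the original order is preserved
    within the hostile and non-hostile groups, and the feed keeps
    exactly the same posts.
    """
    return sorted(feed, key=lambda p: hostility_score(p) > 0)

feed = [
    Post(1, "Imprison every traitor on the other side!"),
    Post(2, "Here is our new policy proposal."),
    Post(3, "Great turnout at the town hall today."),
]
reranked = rerank(feed)
print([p.id for p in reranked])  # → [2, 3, 1]
```

The key property, mirroring the study's design, is that `rerank` returns a permutation of its input: nothing is censored, only the ordering changes.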

The trial involved 1,256 X users ahead of the 2024 US presidential election. Participants were randomly assigned to one of two groups for one week: one group saw polarizing posts ranked higher, while the other saw them ranked lower. Participants whose hostile content was demoted developed more positive attitudes toward the opposing political party, an effect observed among both liberal and conservative users.

"Social media algorithms shape our lives, yet until now only platforms controlled them," said Michael Bernstein, professor of computer science at Stanford and lead author of the study. "Our approach allows researchers and users to gain that control."

Bernstein added that the tool could be a step toward creating social media environments that not only reduce partisan hostility but also build social trust and healthier democratic dialogue.

Josephine Schmitt, scientific coordinator at Germany's Center for Advanced Internet Studies, noted that the study showed strong effects on emotional tensions between political groups. "Even minor algorithmic changes can significantly shift attitudes toward the political 'other,' proving that feed sorting is not neutral," she said.

Philipp Lorenz-Spreen from the Max Planck Institute for Human Development cautioned that results might differ in countries without a clear two-party system, where polarization is less sharply defined, and suggested repeating similar experiments internationally. Schmitt also noted that X plays a smaller role in daily media consumption in other countries than in the United States. Neither Schmitt nor Lorenz-Spreen participated in the study.

Addition from the author

Analysis: Algorithmic Control as a Path to Reduced Polarization

The Stanford study introduces a novel approach to mitigating political hostility on social media without removing content. By reordering posts rather than censoring them, the tool demonstrated measurable improvements in users' attitudes toward opposing political parties. The trial of 1,256 participants showed that even minor changes in feed ranking can influence perceptions and reduce partisan hostility.

This research highlights the potential for users and independent researchers to gain control over algorithmic influence, a domain traditionally monopolized by platforms. The findings suggest that algorithmic design is not neutral and that feed curation can actively shape political attitudes, providing a mechanism to strengthen democratic dialogue.

While promising, the study’s results are context-specific. Experts caution that effects may differ outside the U.S. or in political systems without a dominant two-party structure. International replication will be essential to understand the broader applicability of these findings.



Author: Sophia Brooks
