New Study Shows This Social Platform Can Shift Your Politics To The Right — And It Happens Very Fast

Source: huffpost.com

It takes only a few weeks for X, the platform formerly known as Twitter, to nudge users’ political views to the right.

That’s the key takeaway from a new study published in the scientific journal Nature this month that examines the platform’s outsized influence on political views. X was taken over in 2022 by Elon Musk, an ally of President Donald Trump, and now has more than 600 million monthly active users globally.

Although researchers have long suspected that social media algorithms can shape political attitudes, this study is among the first to test that idea through a real-world randomized experiment on a major platform, said Germain Gauthier, one of the study’s authors.

“X was a great opportunity to study the effects of algorithmic feeds, because the platform explicitly allowed users to opt out of their algorithmic feed and replace it with a simple reverse-chronological ordering of the posts of accounts you followed,” said Gauthier, who is an assistant professor in the Department of Social and Political Sciences at Bocconi University in Italy. “We could look at a sample of X users without relying on the platform’s collaboration.”

The researchers randomly assigned 4,965 active U.S.-based X users to one of two groups: One group used X’s default “For You” timeline, an algorithmically curated feed that mixes posts from accounts the user follows with recommended posts.

The second group used a chronological feed, showing only posts from accounts the users followed in the order they were posted. The experiment ran for seven weeks in 2023.

What’d they find after the seven weeks?

“Over the seven-week treatment period, users who switched from the chronological to the algorithmic feed shifted their political opinions to the right,” Gauthier told HuffPost.

“For You” users were 4.7 percentage points more likely to prioritize Republican policy issues (e.g., immigration, inflation, and crime) and generally more likely to view the criminal investigations into Donald Trump as unacceptable.


The algorithmic-feed group was also more likely to hold pro-Kremlin views on the war in Ukraine: they were 7.4% less likely to view Ukrainian President Volodymyr Zelenskyy positively, and scored slightly higher on a pro-Russian attitude index overall, Gauthier said.

“They also began following more right-wing accounts on the platform, particularly political activists,” he added.

This shift occurred even though users did not report changes in party affiliation or levels of affective polarization. In the sample, 46% of participants identified as Democrats and 21% as Republicans. The group was 78% white and 52% male, and it skewed relatively well educated, with 58% having completed at least four years of university.

There was no change in perspective for those who were previously using the algorithmic feed and were switched to the chronological one.

What might be most unnerving is that the study showed users’ new follow patterns persisted even after switching back to the chronological feed. In other words, toggling off algorithmic recommendations didn’t create a neutral slate for the X user.

“Deeply held concepts like partisan identity are unlikely to move over such short timeframes, but what’s striking is that opinions on current politics did shift in just a few weeks,” Gauthier said. “That naturally raises the question of what years of exposure might do. For now, we simply don’t have the answer.”

How X’s algorithm turns users more right wing

Gauthier and his team also examined how the algorithm drove these shifts, focusing on what it chose to amplify. Conservative-leaning posts were about 20% more likely to appear in algorithmic feeds, while liberal posts were only 3.1% more likely. The “For You” algorithm on X also significantly demoted posts from traditional news organizations, while promoting or boosting content from political activists.

“While our study did not address this, one could imagine that removing a large share of traditional news from users’ feeds would affect how politically informed they are,” he said.

Findings about X functioning as an echo chamber may come as no surprise to many observers, given that Musk makes no secret of his political views. The tech billionaire has pushed for issues like reduced immigration and deregulation of business, amplified the so-called “Great Replacement” theory, and taken a keen interest in promoting conservative movements and right-wing figures, particularly since his 2022 acquisition of the platform.

Musk put over $200 million into his own pro-Trump “America PAC,” working on voter mobilization in key swing states and acting as a surrogate at rallies. On X itself, Musk amplified pro-Trump content and offered swing-state voters $1 million to sign his America PAC petition to drum up voter registration.


Outside the U.S., the world’s richest man has publicly endorsed Nigel Farage’s Reform party in the U.K. and the German Alternative für Deutschland, or AfD, a far-right political party with deep ties to the neo-Nazi movement.

Earlier this month, French police raided X’s offices as part of a year-long investigation into “political” algorithm manipulation. Musk maintains the probe is purely a political vendetta.

On X, he very much “likes” the type of content the “For You” tab amplifies: A Guardian analysis of his January activity found that he posted about the white race being under threat, alluded to eugenics, or promoted anti-immigrant conspiracy content on 26 out of 31 days that month.

How much of this is a problem unique to X? Does this sort of echo chamber effect happen on more left-leaning social media platforms — say, on Bluesky’s “Discover” tab rather than its “Following” tab? (There has been much hubbub on the right about Bluesky’s liberal echo chamber problem.)

Gauthier said he couldn’t speculate on Bluesky or Meta’s Threads, another X competitor. Although his experiment focused specifically on X, he emphasized that his broader goal was to determine whether feed algorithms can influence political opinions. For now, he added, researchers are only at the very early stages of quantitatively understanding the societal effects of algorithms.

“Whether you are a conservative or a liberal, these findings should interest you. They raise deeper questions about how algorithms shape our consumption of political information,” Gauthier said. “It’s time people realize that these algorithms shape our societies, and for them to reflect on what kind of influence they are comfortable with, and how we should think about accountability.”