A study published in the journal Nature on Wednesday revealed that TikTok's algorithm systematically prioritized pro-Republican content in three states during the lead-up to the 2024 US elections. Researchers created hundreds of dummy accounts, conditioning them to mimic real user behavior by watching videos aligned with either the Democratic or Republican parties. They then tracked the videos recommended on these accounts' For You pages, TikTok's primary feed.
Study Findings
“We found a consistent imbalance,” the researchers wrote. Bots trained on pro-Republican content viewed approximately 11.5% more content aligned with their views than their pro-Democratic counterparts did. The imbalance also extended to exposure to opposing viewpoints: bots trained on pro-Democratic content were about 7.5% more likely to encounter pro-Republican content on their For You pages.
Using 323 dummy accounts located in New York, Texas, and Georgia, the researchers analyzed more than 280,000 recommended videos across 27 weeks of the 2024 presidential campaign. The analysis combined human and AI review.
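To illustrate the kind of measurement the study describes, here is a minimal sketch of how an aligned-content imbalance could be computed from a labeled recommendation log. The data layout and labels (`"rep"`, `"dem"`, `"pro_rep"`, `"pro_dem"`) are hypothetical, not the study's actual code or schema.

```python
# Hypothetical recommendation log: (account_conditioning, video_label) pairs.
# In the study, over 280,000 recommended videos were classified by a
# combination of human and AI review; the labels below are illustrative only.
logs = [
    ("rep", "pro_rep"), ("rep", "pro_rep"), ("rep", "pro_dem"),
    ("dem", "pro_dem"), ("dem", "pro_rep"), ("dem", "pro_rep"),
]

def aligned_share(logs, condition, aligned_label):
    """Fraction of a condition's recommendations that match its own side."""
    labels = [label for cond, label in logs if cond == condition]
    return sum(label == aligned_label for label in labels) / len(labels)

rep_share = aligned_share(logs, "rep", "pro_rep")  # same-side share, GOP bots
dem_share = aligned_share(logs, "dem", "pro_dem")  # same-side share, Dem bots

# Relative imbalance: how much more same-side content Republican-conditioned
# bots saw than Democratic-conditioned bots (the study reported roughly 11.5%).
imbalance = (rep_share - dem_share) / dem_share
```

On this toy log, Republican-conditioned accounts see two-thirds same-side content versus one-third for Democratic-conditioned accounts; the real study computed analogous shares over hundreds of accounts and thousands of videos.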
Differential Content Exposure
“Our finding isn’t just about reinforcement; Democratic accounts were shown significantly more anti-Democratic content than Republican accounts were shown anti-Republican content,” said co-author Talal Rahwan. “The algorithm wasn’t just giving people what they want; it was giving one side more of what the other side says about them.”
The types of issues presented also differed. Pro-Democratic accounts received disproportionately more cross-partisan content on immigration and crime, while pro-Republican accounts saw more cross-partisan content on abortion. “This suggests the algorithm may amplify content designed to attack the opposing side on its weakest ground, which is a more targeted and arguably more concerning pattern than a uniform ideological drift,” added Hazem Ibrahim, a PhD student at NYU Abu Dhabi involved in the study.
TikTok's Response
A TikTok spokesperson disputed the findings, stating, “This artificial experiment with fake accounts does not reflect how people actually use TikTok. In reality, people discover and watch a wide variety of content on our platform which they continuously shape and can control through more than a dozen tools the authors seem unaware of.”
Study Limitations and Context
The researchers acknowledged that many users self-select content on social media, but noted that TikTok's For You page gives users less control than other platforms, as it is “almost entirely driven by the platform's algorithm.” A TikTok spokesperson contested this claim, saying users have many customization options. “Users don’t need to follow anyone; the system decides based on behavioral signals like watch time. That makes it a uniquely clean setting for studying algorithmic influence,” Ibrahim explained.
The study did not analyze the impact of these videos on political beliefs or the reasons for the imbalance. It only captured early stages of user experience and examined English-language transcripts, missing visual or non-English cues. A TikTok spokesperson argued these limitations show the study is not representative of real user behavior.
Despite these caveats, the authors stress the relevance of their findings to debates on platform transparency and algorithmic accountability. The Nature article noted that under the EU Digital Services Act, large platforms must assess electoral risks, while in the US, First Amendment protections give platforms more editorial freedom. Co-author Zaki concluded: “In an environment where margins are thin, systematic differences in the kind of political information recommended to tens of millions of young voters are worth taking seriously.”