The Algorithm Votes Too: How TikTok Data May Be Shaping Elections

[Image: A stylized digital illustration of a smartphone displaying the TikTok logo, with shadowy algorithmic code flowing behind it, symbolizing hidden influence on political content.]

The Perfect Weapon for Influence

TikTok isn’t just a platform for lip-sync videos and cooking hacks. It’s a finely tuned behavioral engine, constantly feeding its algorithm with your every like, swipe, pause, and rewatch. What makes it different from other platforms? Granularity. TikTok doesn’t just know that you like politics; it knows which kind of political content holds your gaze for 3.2 seconds longer, which clips you silently loop, and which ones you scroll past the instant they appear. Now imagine that kind of microdata in the hands of operatives aiming to influence public sentiment during an election.
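
To make that idea of granularity concrete, here is a minimal sketch, in Python, of the kind of engagement score a short-video recommender could derive from those micro-signals. The WatchEvent structure, signal names, and weights are all hypothetical illustrations; TikTok has never published its actual ranking model.

```python
# Illustrative sketch only: a hypothetical engagement score of the kind a
# short-video recommender might compute. Signal names and weights are
# invented; TikTok's real ranking model is not public.
from dataclasses import dataclass

@dataclass
class WatchEvent:
    video_id: str
    watch_seconds: float   # how long the viewer stayed on the clip
    video_seconds: float   # total length of the clip
    rewatched: bool        # did the viewer loop it?
    liked: bool
    shared: bool

def engagement_score(event: WatchEvent) -> float:
    """Combine micro-signals into a single relevance score (hypothetical weights)."""
    completion = min(event.watch_seconds / event.video_seconds, 1.0)
    score = 1.0 * completion                    # dwell time dominates
    score += 0.8 if event.rewatched else 0.0    # a silent loop is a strong signal
    score += 0.5 if event.liked else 0.0
    score += 0.7 if event.shared else 0.0
    return score

# A viewer who quietly loops a political clip outranks one who merely likes another.
quiet_rewatch = WatchEvent("v1", 34.0, 30.0, rewatched=True, liked=False, shared=False)
casual_like = WatchEvent("v2", 12.0, 30.0, rewatched=False, liked=True, shared=False)
print(engagement_score(quiet_rewatch) > engagement_score(casual_like))  # True
```

The point of the sketch is simply that passive behavior, not explicit likes, can dominate what the feed decides you care about.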


Case 1: The Philippines, 2022 — “The Marcos Makeover”

As Bongbong Marcos campaigned for the presidency, TikTok was suddenly flooded with nostalgic clips praising the Marcos dictatorship—a time many remember for martial law and repression. But on TikTok, the narrative shifted: upbeat music, soft color grading, smiling faces. “Golden Age” was a common caption. Thousands of such videos appeared across accounts with no prior history of political content.

According to local watchdog groups, many of these accounts were traced back to content farms in Vietnam and Indonesia—outsourced manipulation. And the kicker? These videos rarely showed Marcos Jr. himself. Instead, they used emotion-laden content about Filipino pride and historical ‘redemption’. The result? Bongbong Marcos won by a landslide, despite facing widespread skepticism just months earlier.


Case 2: Kenya, 2022 — The Digital Silence

In the weeks leading up to Kenya’s general election, several political hashtags tied to opposition leader Raila Odinga suddenly stopped appearing on TikTok’s For You pages. Independent digital researchers noted a sharp decline in the visibility of Odinga-related content, even while pro-government videos surged.
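
Findings like this typically rest on repeated feed sampling rather than any privileged access. The sketch below shows the rough logic with entirely invented numbers and a hypothetical setup: sample a large batch of For You feeds each day, count how often videos tied to the hashtag appear, and flag days that fall far below a trailing baseline.

```python
# Illustrative sketch: spotting a sudden visibility drop for a hashtag from
# daily feed samples. All counts below are invented, not real measurements.
from statistics import mean

# appearances of Odinga-tagged videos per 1,000 sampled For You feeds, by day
daily_appearances = [48, 52, 47, 50, 49, 51, 9, 7, 8]

def flag_visibility_drop(counts, baseline_days=5, drop_ratio=0.5):
    """Flag days where appearances fall below drop_ratio times the trailing baseline."""
    flagged = []
    for day in range(baseline_days, len(counts)):
        baseline = mean(counts[day - baseline_days:day])
        if counts[day] < drop_ratio * baseline:
            flagged.append((day, counts[day], round(baseline, 1)))
    return flagged

print(flag_visibility_drop(daily_appearances))
# [(6, 9, 49.8), (7, 7, 41.2), (8, 8, 33.2)] -- a sharp, sustained drop
```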

TikTok denied any algorithmic manipulation but offered no data to support the denial. The Center for Digital Democracy later revealed that content moderation operations had been quietly relocated to the UAE, where filtering guidelines may be shaped by political arrangements with no accountability to Kenyan voters.


Case 3: United States, 2024 Primaries — The Meme Flood

During the 2024 Republican primaries, young voters on TikTok began posting a flood of anti-Ron DeSantis content: funny, but relentlessly hostile. Most of it came from seemingly organic meme pages. But analysis by cybersecurity firm Naxon showed that many videos shared identical metadata and upload timestamps, suggesting they were bulk-uploaded using scheduling software.
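
The pattern Naxon reportedly found can be illustrated with a simple grouping exercise: cluster videos on metadata fields that genuinely independent creators would rarely share exactly, such as the upload second or an editing-template identifier, and flag unusually large clusters. The field names and records below are hypothetical.

```python
# Illustrative sketch: flagging possible bulk uploads by grouping videos on
# metadata fields that independent creators would rarely share exactly.
# All field names and records are hypothetical.
from collections import defaultdict

videos = [
    {"id": "a1", "uploaded_at": "2024-01-12T03:00:07Z", "encoder": "CloudCut v2.3", "template": "T-118"},
    {"id": "a2", "uploaded_at": "2024-01-12T03:00:07Z", "encoder": "CloudCut v2.3", "template": "T-118"},
    {"id": "a3", "uploaded_at": "2024-01-12T03:00:07Z", "encoder": "CloudCut v2.3", "template": "T-118"},
    {"id": "b1", "uploaded_at": "2024-01-12T14:22:41Z", "encoder": "CapCut 9.1",    "template": "T-042"},
]

def flag_bulk_uploads(videos, min_cluster=3):
    """Return clusters of videos sharing identical upload second, encoder, and template."""
    clusters = defaultdict(list)
    for v in videos:
        key = (v["uploaded_at"], v["encoder"], v["template"])
        clusters[key].append(v["id"])
    return {key: ids for key, ids in clusters.items() if len(ids) >= min_cluster}

print(flag_bulk_uploads(videos))
# {('2024-01-12T03:00:07Z', 'CloudCut v2.3', 'T-118'): ['a1', 'a2', 'a3']}
```

In practice analysts combine this with account-creation dates, posting cadence, and audio fingerprints, but identical metadata across supposedly unrelated accounts is usually the first tell.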

Further forensic analysis revealed cloud-based editing templates with embedded TikTok API hooks, used for mass uploads through unofficial channels. The content wasn’t banned. It went viral.

And here’s the odd twist: while anti-DeSantis content exploded, Trump-related content was subtly sanitized. Old controversial clips were getting buried, while flattering edits—highlighting his dance moves, comedic moments, or populist slogans—received unexplained boosts. No official source claimed credit.


Case 4: Brazil, 2022 — The Invisible Hashtag

In the days before Brazil’s election, pro-Bolsonaro hashtags like #ForçaBolsonaro mysteriously dropped from TikTok’s trending list, despite appearing in millions of posts. Pro-Lula accounts, many newly created, suddenly spiked in engagement. Users reported seeing Lula-friendly videos even after engaging heavily with Bolsonaro-related content.

Investigations later showed that TikTok’s Brazilian moderation team had collaborated with a “third-party safety partner” whose leadership had publicly endorsed Lula’s party. Despite the massive outcry, no transparency report was ever published.


[Image: A stylized digital illustration showing TikTok interface elements combined with keywords like ‘voting’, ‘misinformation’, and ‘algorithm’, representing how trending content may be manipulated to influence public opinion during elections.]

The Real Question: Who Controls the Feed?

If TikTok’s feed can be quietly steered without users noticing, is it not a form of invisible censorship—or worse, manufactured consent? Every manipulated trendline, suppressed hashtag, and boosted personality shapes public perception, nudging users toward a specific worldview.

Elections aren’t just about voting booths anymore. They happen in your pocket, with every scroll.
