Scroll, Like, Deceive: Murky Political Accounts on TikTok before the German 2025 Elections

Political influence on TikTok is a numbers game – and someone was playing to win. Our research shows how inauthentic accounts drove engagement before Germany’s 2025 elections.

We identified 138 “murky” accounts – political profiles that impersonated or amplified parties and politicians without disclosing their true affiliation. The findings highlight critical gaps in TikTok’s policy enforcement and ongoing risks to democratic integrity.

Access the Report

The report confirms that the far-right Alternative for Germany (AfD) was the primary beneficiary of murky account activity. Of the 138 identified accounts, nearly 69% either impersonated AfD politicians or falsely presented themselves as official party pages. These accounts were not only the most numerous but also the most successful in terms of reach and engagement. Between January and February 2025, murky accounts uploaded 937 videos – 97% of which promoted the AfD. Some amassed tens of thousands of followers, with the most popular accounts reaching up to 130,000. While murky accounts promoting other parties, such as the CDU, Greens, The Left, and FDP, were also identified, they had significantly lower engagement.

These accounts deployed sophisticated tactics to evade detection and amplify their reach. Many closely mimicked real politicians, such as Alice Weidel and Björn Höcke, using similar usernames, profile pictures, and slogans. Others engaged with trending TikTok formats, leveraging hashtags like #fyp and #seischlauwählblau or using viral music to boost their visibility. Content strategies included memes, AI-generated imagery, and highly polarising narratives on immigration, LGBTQ+ rights, and foreign policy.

Although TikTok eventually removed 127 of these accounts – 111 after being reported by researchers – the report raises concerns about the ease with which deceptive political profiles can be created and scaled. TikTok’s failure to proactively enforce its own policies, combined with the lack of mandatory verification for political accounts in the EU, left the platform vulnerable to manipulation. This calls into question whether TikTok met its obligations under the EU’s Digital Services Act (DSA), which requires very large online platforms to assess and mitigate systemic risks to electoral processes.

The findings from this report underline the urgent need for stronger platform accountability and enforcement mechanisms. The continued presence of murky accounts shows that TikTok remains an unreliable space for political discourse, where inauthentic activity can distort electoral debates. To protect democratic integrity in future elections, policymakers, regulators, and platforms must take decisive action against these deceptive practices.

For a full breakdown of the findings, read the report.

