Background
In the lead-up to the 2025 German Federal Elections, many civil society organisations undertook monitoring efforts to investigate major parties’ digital strategies and online electoral risks. Because the elections were also one of the first instances in which all current EU regulatory frameworks related to the digital space – i.e., the Digital Services Act (DSA) and the AI Act – were in effect, this research was all the more critical.
To hear from the broader civil society community, on Tuesday, February 25, DRI’s Digital Democracy team held a roundtable discussion in Berlin with policymakers, researchers, and advocates. Attendees included representatives from AlgorithmWatch, the German Digital Services Coordinator (DSC), the Media Authority Berlin-Brandenburg, CeMAS, the DSA 40 Data Access Collaboratory, HateAid, the Institute for Strategic Dialogue (ISD), and Das NETTZ.
During the event, we discussed findings from our recent work around the German elections, including how chatbots performed in providing electoral information in English and German; murky account activity on TikTok that potentially violates TikTok’s policies and EU regulations; how the AfD has been engaging with voters on Facebook; and how recommender systems respond to different political profiles.
Key Research Insights
- Research by AlgorithmWatch similarly found that prominent large language models (LLMs) are not competent at providing information about regional elections, failing even on very basic facts. In addition, using these models’ APIs was noted as problematic for research purposes, since API responses differ significantly from the interface responses that everyday users see.
- Joint research between AlgorithmWatch and the Atlantic Council found that Elon Musk’s support of the AfD appeared to centre more on explicit backing for Alice Weidel than on the party overall.
- Other organisations have experienced similar denials of access to data from X. The only organisation that did receive access was restricted to a single, narrowly defined project, and the data was provided only after the elections it aimed to monitor had taken place. When it tried to reapply, it was again denied.
- An investigation by ISD into TikTok’s recommender algorithm found that content from AfD fan pages and affiliated accounts was more likely than that of other parties to be promoted on users’ feeds. This finding aligns with our own research.
- Foreign interference was also observed, with CeMAS identifying and reporting yet another Doppelganger network on X.
- Access to data via TikTok’s Research API and Virtual Compute Environment (VCE) remains an ongoing challenge for multiple organisations. Delays and other issues have been reported to the appropriate DSCs and the EU.
Following this research-based discussion, we explored advocacy strategies to strengthen enforcement of the DSA’s risk mitigation framework, and addressed the challenges and opportunities for regulators, civil society, and tech platforms in ensuring transparency in the post-election landscape.
Policy Considerations
- Civil society should act more collaboratively to forge strategies that reinforce the EU’s stance in safeguarding the DSA and the Digital Markets Act (DMA) against mounting pressure, including from political forces such as the Trump administration.
- When reporting on moderation or compliance issues with platforms, civil society organisations should focus on counterexamples from platforms where Trust & Safety and recommender systems work better. Highlighting successful examples from other platforms (e.g., Bluesky) demonstrates that improvement is possible. Findings should, in general, be framed in a way that helps platforms see the value they stand to gain, making it easier for them to implement proposals coming from civil society.
- Organisations need to shift from providing research evidence to offering legal evidence that can support stronger investigations. While some participants expressed concerns about narrowing the scope of research to only what is legally actionable, collaboration is key to balancing academic insight with the need for legal enforcement.
- The concept of Foreign Information Manipulation and Interference (FIMI) may be losing its ability to bring together different political views on the need to regulate online platforms. Other issues that could create a broader consensus include online risks to minors, online gender-based violence, and dark patterns.
What are the next steps?
In the short term, we should continue engaging with the European Commission as well as the new German government. It is important to raise awareness among conservative politicians of the significance of the DSA: by holding platforms accountable and promoting online safety, we can reduce polarisation and counter US propaganda. In the medium term, we should be ready to contribute insights during the DSA evaluation, which is expected in 2027, though it could happen sooner. Long-term goals are still to be defined.
This event was organised by Democracy Reporting International gGmbH as part of the access://democracy project, funded by the Mercator Foundation. The contents of this report and of any related publications following this event do not necessarily represent the position of the Mercator Foundation.