On 22 January, the European Commission published the updated subscription documents submitted by major platforms under the Code of Practice on Disinformation.
DRI regrets that social media platforms decided to scale back many of the commitments they made in 2022 under this Code of Practice, which will now become a Code of Conduct under the Digital Services Act (more on that below).
The Code comprises a broad set of commitments aimed at ensuring an effective fight against disinformation. Yet platforms have reduced their commitments by 31%. This reduction affects crucial measures, notably cooperation with and support for the fact-checking community. YouTube, Google Ads, Google Search, and Microsoft's LinkedIn and Bing Search have withdrawn entirely from all commitments related to cooperation, collaboration, and financial support for fact-checkers. Meta has so far upheld its commitments; however, the platform has made clear that this stance could change following Mark Zuckerberg's announcement and the rollout of Community Notes. TikTok has also maintained its fact-checking commitments, but on the condition that other platforms do the same. In short, there is a tangible risk that the European system of fact-checking will collapse.
The Code of Practice becomes a Code of Conduct
Last year, the European Commission began the process of transforming the Code of Practice on Disinformation into a Code of Conduct under the Digital Services Act. The conversion was supposed to mark a significant milestone. Codes of Conduct are key to the effective implementation of the Digital Services Act: they provide more detailed standards, developed jointly by companies through regular, constructive exchanges among multiple stakeholders. Conversion to a Code of Conduct under the DSA makes the Code more authoritative, providing a benchmark for implementing DSA obligations.
We are hence very disappointed that during this process, platforms have scaled back many of the commitments made in 2022.
You can see an overview of all the changes in platforms' commitments in our summary deck:
As mentioned above, several platforms have scaled back their commitments to fact-checking. Collaborations between platforms and fact-checkers were built up diligently over the years; they are now wrongly presented as "censorship". This claim is false: responsibility for content moderation, including content removal, has always rested with the platforms, not with fact-checkers. Although fact-checking is not a perfect solution, research demonstrates its effectiveness in correcting false beliefs about specific claims. Platforms also cut back commitments in other critical areas, including measures to empower the research community (down by 52.43%) and those addressing political advertising (down by 55.23%).
Platforms such as Google, Microsoft, and TikTok have justified these reductions by claiming that they no longer allow political ads. However, DRI has repeatedly flagged cases of political ads on TikTok. Recent events in Romania, where courts ordered a re-run of the presidential election due to foreign interference campaigns on TikTok, highlight the urgent need for enforcement reports and transparency around these so-called "no-political-ads" policies.
We remain a committed signatory of the Code, actively contributing evidence of threats to online public discourse and monitoring platform compliance with their commitments. We will publish a comprehensive analysis of the updated commitments soon. Stay tuned!