From Policy to Practice: DSA Implementation in Focus Across the EU

Executive Summary 

The impact of the Digital Services Act (DSA) across the European Union depends largely on the work of the national Digital Services Coordinators (DSCs). To understand how the law is translated into practice, we held three focus group discussions with 12 stakeholders, including 8 DSC representatives from 6 member states. Our discussions highlighted several challenges in implementing the DSA at the national level:

  1. Delays in passing national laws have led many DSCs to rely on informal enforcement, with political instability and budget issues slowing progress further. 
  2. Resource shortages and difficulties in recruitment compound the problem. 
  3. Confidence in the trusted flagger system is low, with the impression that the system has poorly designed incentives. 
  4. Collaboration with Civil Society Organisations (CSOs), critical for gathering evidence of non-compliance, is limited by funding and procedural obstacles. 
  5. Public awareness of the DSA remains insufficient, underscoring the need for targeted outreach and DSA awareness campaigns. 

To address these challenges, we recommend, among other things: 

  • Strengthening DSCs through targeted capacity-building initiatives, including training and practical resources.
  • Investing in public awareness campaigns to inform citizens about their rights and how to engage with the DSA.
  • Encouraging CSOs to form coalitions to advocate for the resources needed to support the enforcement of the DSA. These coalitions can work with donors to secure funding for digital rights projects and influence policy at both EU and national levels. 

Introduction 

The impact of the DSA is contingent on how effectively it is implemented at the national level, with DSCs playing a crucial role in overseeing and enforcing the DSA within their country. More than eight months after the DSA took full effect, this brief examines how DSCs across the EU are fulfilling their responsibilities, focusing on the key challenges they face. 

After conducting preliminary research, DRI organised three focus groups with key DSA implementation stakeholders between 27 September and 2 October 2024. The groups included 3 representatives from CSOs, 1 from an academic institution, and 8 DSC representatives from 6 small-to-medium member states. We tailored our questions to each group, focusing on the current status of DSA implementation in each country, the challenges DSCs are facing in meeting their responsibilities, and their plans for collaborating with external stakeholders, especially CSOs. We also asked for their views on how aware citizens in their countries are about the role of DSCs and their own digital rights. 

The discussions took place under the Chatham House Rule. The names of the participants and the name of the DSC or CSO they work for will not be disclosed. While we may refer to news and other secondary sources for context and mention specific countries, this does not imply that these countries participated in the focus groups. We do not attribute statements or actions to any country. 

DSCs: crucial actors in driving DSA implementation   

DSCs have multiple responsibilities under the DSA, which broadly fall into three categories: (1) enforcing DSA rules for intermediary services based in their country, (2) serving as the central hub for users’ complaints, and (3) certifying third-party actors involved in the DSA’s implementation. 

Enforcement responsibilities 

DSCs are responsible for monitoring and addressing non-compliance by non-Very Large Online Platforms (VLOPs) and non-Very Large Online Search Engines (VLOSEs) established in their country. DSCs also share with the Commission the competence to supervise VLOPs and VLOSEs established in their country, except for the “due diligence” obligations in Section 5 of Chapter III of the DSA, which are enforced exclusively by the Commission. 

Collaboration between DSCs, the Commission, and the European Board for Digital Services is key to this framework. Articles 57–60 of the DSA encourage these actors to share information, cooperate on investigations, and allow DSCs of establishment to lead joint investigations with other DSCs. 

With both direct and shared enforcement powers, DSCs play a vital role in DSA governance. For instance, they oversee platforms such as Telegram which, although not a VLOP, still serves as a major hub for issues such as hate speech, disinformation, and other harmful content.

To carry out their enforcement duties, DSCs have broad powers. They can launch investigations into potential infringements by online platforms and other intermediary services, request information, and conduct inspections. Additionally, DSCs can accept commitments from platforms, impose fines or periodic penalty payments, and adopt interim measures. Moreover, if all other powers have been exhausted and the infringement continues, they can employ escalation measures under Article 51.

DSCs also have indirect enforcement powers. Along with the European Commission, they form the European Board for Digital Services, an independent advisory body that contributes to the consistent application of the DSA. The Board coordinates joint investigations, analyses reports and audits of VLOPs and VLOSEs and advises the Commission on enforcement actions. It also promotes the development of European standards, guidelines, and codes of conduct, identifies emerging issues, and develops recommendations accordingly (Article 63). In this sense, the Board serves as a valuable forum for collaboration between stakeholders.1

Contact point for users’ complaints 

DSCs are also responsible for reviewing complaints from users or organisations acting on their behalf when they believe a service provider has violated DSA rules. 

Crucially, DSCs do not moderate content, meaning they cannot order the removal of specific content. Their role is to ensure platforms comply with their obligations under the DSA. For example, when dealing with illegal content, DSCs do not determine whether the content itself is illegal. Instead, they check whether platforms were diligent in handling notices, and whether their notice mechanisms are easy to access, user-friendly, and available electronically, as required by Article 16. As Coimisiún na Meán, the Irish DSC, states, its role is solely “to ensure platforms have complaint mechanisms in place and are operating them diligently”.2  Platforms’ decisions cannot be appealed to DSCs, which have no decision-making power on this issue. Users do, however, have other redress options. They can file a complaint against the platform through its internal complaint-handling system, which is required under Article 20. Users can also turn to out-of-court dispute settlement bodies (Article 21) or appeal directly to the courts. 

Some examples of the grounds on which users can file complaints with DSCs include:3 

  • Problems with reporting illegal content (e.g. when platforms are not diligent in handling notices).
  • Issues dealing with account/content/service restrictions. According to Art. 17 DSA, online platforms should provide users with an explanation of why their accounts or content have been blocked or removed. Users can complain to the DSCs if, for example, platforms do not provide users with such a “statement of reasons”.
  • Problems with the contact point or the legal representative of the online platform.
  • Problems with the terms of use / general terms and conditions (e.g. platform policies are not clear or easily accessible or platforms do not inform their users of significant changes to the Terms and Conditions).
  • Problems with out-of-court dispute resolution.
  • Problems with protection against misuse of user accounts.
  • Problems with online protection of minors.
  • Problems caused by misleading and manipulative presentation of the service (dark patterns).
  • Transparency problems with online advertising.
  • Transparency problems with recommender systems. 

Users can file complaints with their country’s DSC, which will review the case. If needed, the DSC will forward the complaint to the DSC in the country where the service is based, along with its opinion. Additionally, DSCs must produce annual reports that summarise the number of complaints received and how they were handled (Article 55). 

Certifying/vetting third-party actors involved in the implementation of the DSA  

Another key responsibility of DSCs is certifying or vetting third-party actors involved in the implementation of the DSA, including (a) trusted flaggers, (b) vetted researchers, and (c) out-of-court dispute resolution bodies. 

Trusted flaggers are special actors who can flag illegal content; their requests must be prioritised by providers of intermediary services. These actors must have expertise in finding and reporting illegal content, be independent of any online platform provider, and act responsibly, accurately, and objectively when submitting reports. DSCs are responsible not only for establishing the procedures to designate trusted flaggers but also for revoking this status if a trusted flagger no longer meets the required criteria. 

DSCs are also responsible for granting “vetted researcher” status to those applying for access to platform data, provided they meet the criteria outlined in Article 40(8) of the DSA. They also certify out-of-court dispute settlement bodies: independent organisations or entities that help resolve disputes between recipients of online services and online platform providers (Article 21). These bodies offer a mechanism to address disagreements outside of the formal court system. This is without prejudice to the possibility for the parties to go to court, as these bodies’ resolutions are not binding (Art. 21(2)). Only four such bodies have been certified so far by DSCs.4 

Designated DSCs and the Rollout of DSA Laws 

The deadline for appointing DSCs was 17 February 2024, when the DSA fully came into force for platforms of all sizes. However, not all EU countries met this deadline. Some did not appoint the DSC authorities in time, while others did not give them the power to carry out their duties under the DSA. As a result, the European Commission initiated infringement proceedings against 12 member states on 24 April5 and 25 July 2024.6 

In contrast, certain member states, like Ireland and Germany, have established particularly robust and well-resourced DSCs, positioning them to play a leading role in the supervision and enforcement of the DSA. 

In Germany, the DSA has been implemented through the Digitale-Dienste-Gesetz (DDG), which names the German Federal Network Agency (BNetzA) as Germany’s DSC. Other state agencies, such as the Federal Department for Media Harmful to Young Persons and the Federal Commissioner for Data Protection and Freedom of Information, work closely with the BNetzA. One of the key innovations of the German DSC framework is the Advisory Council, an independent body of experts that supports the BNetzA in carrying out its tasks and serves as a forum for discussion with academia, industry associations and civil society.7 

Ireland is another leading member state in DSA implementation. It plays a prominent role given that many of the relevant tech companies have their European headquarters in the country. Ireland implemented the DSA through the “Irish Act”, which has been fully enforceable since 17 February 2024. Coimisiún na Meán has been designated as DSC; it was already the regulator for broadcasting and online safety in Ireland, established by the Online Safety and Media Regulation Act 2022. A second designated authority under the DSA is the Competition and Consumer Protection Commission (CCPC). Ireland allocated €6 million to the DSC, and as the state of establishment for many large tech companies, it will likely see significant activity from both Coimisiún na Meán and the CCPC. The authority has already released guidelines for entities seeking certification as out-of-court dispute settlement bodies and trusted flaggers. Moreover, it recently launched a user contact centre, allowing individuals to inquire about the types of issues they can raise with the organisation through complaints.8 

Findings from Focus Groups 

While DSCs are encountering various challenges, positive outcomes and good practices have also emerged. In this section, we will explore key trends identified during our focus groups.  

Delays to or Absence of Implementing Laws  

Many countries have not yet passed legislation to implement the DSA at the national level or even appointed their DSCs. Our focus groups revealed that four of the six member states with appointed, or soon-to-be-appointed, DSCs have not yet issued a national DSA implementing law. As a result, some appointed DSCs cannot proceed with recruitment, trusted flagger accreditation, and other procedures. They must also rely on ‘informal enforcement’, such as meetings and discussions with platforms, as they cannot yet carry out enforcement tasks or take decisions with legal consequences (e.g. information requests, sanctions, etc.).  

Some DSCs view the well-established Irish DSC as a model to follow. On the positive side, this approach could promote consistency across the EU in certifying DSA third-party actors. However, as one participant pointed out, it’s also crucial for each country to adapt these processes to fit the specific needs of the users and organisations in their jurisdiction. 

In addition, the political situation in different countries affects the pace and effectiveness of implementation. Several DSCs and CSOs noted how political setbacks and changes in government have affected the implementation of the DSA in their country, and how shifting political priorities can push implementation forward or backward. 

Technical, Financial and Human Resources  

Article 50.1 of the DSA requires member states to ensure that their DSCs have the necessary resources—technical, financial, and human—to effectively carry out their tasks. Additionally, the regulation mandates that member states guarantee their DSCs have sufficient budgetary autonomy and independence. 

During the focus groups different DSCs expressed concerns regarding the amount of financial and human resources needed to effectively carry out their tasks. While this is partly due to the early stages of DSA implementation, it remains a significant issue. In some countries, budget cuts and delays in budget transfers have hindered recruitment. In others, political challenges, such as debates over the DSA’s objectives, have slowed the development of DSCs. Additionally, in countries where the DSA implementing law has not yet been issued, recruitment has been stalled. 

A further challenge lies in the still-evolving enforcement landscape, which makes it difficult to predict the volume of complaints DSCs will receive. While the current number of complaints may be relatively low, this is likely to change as awareness campaigns are rolled out and users become more informed about their rights and the mechanisms available to them. This situation raises concerns about whether DSCs will have sufficient resources and expertise to manage the increased workload, with some DSCs fearing being overwhelmed and potentially paralysed by a flood of complaints. Many worry that the proposed funding is inadequate for the extensive tasks that DSCs will need to undertake in the future, particularly as the regulatory environment grows more complex and the demands on these bodies intensify. 

Our focus groups revealed that DSA-dedicated teams tend to be small, typically consisting of 5 to 10 members, with only one team exceeding 20 people. The quality of personnel is crucial, but a common challenge is identifying the right profile for DSA-related roles, which complicates the recruitment process. There is general consensus that DSCs require multidisciplinary teams, including experts to monitor social media on a large scale. Including research and data units within DSCs would also be useful, as it could assist DSCs in collecting and analysing data from platforms, as well as from trusted flaggers and other organisations.9 Currently, however, most DSCs are operating with their pre-existing human resources and reorganising their teams in an effort to enhance effectiveness in meeting their obligations. 

Trusted Flaggers  

So far, only 10 trusted flaggers have been accredited, mainly in Finland and Austria10. On 1 October 2024, Germany’s Federal Network Agency also accredited its first trusted flagger, the organisation REspect!11

Many DSCs have begun drafting procedures for accrediting trusted flaggers, often adopting or streamlining Ireland’s approach. Some CSOs, however, find Ireland’s requirements overly complex, arguing that they may result in a lengthy and bureaucratic application process. In some countries, even well-established organisations have faced repeated rejections due to these hurdles. Unclear procedures also make it difficult to challenge accreditation decisions. Some experts raised concerns about platforms relying solely on official trusted flaggers, i.e. those designated by the DSCs, potentially overlooking the insights and expertise of other important stakeholders who do not seek trusted flagger status.  

Overall, there seems to be a lack of confidence in the trusted flagger system. Participants believe that incentives to apply for the status are poorly designed (“Why would an NGO work for free?”). Many DSCs were originally organisations handling telecommunications and regulatory matters and therefore lack experience in dealing with illegal content. Trusted flaggers are thus crucial collaborators. Yet, for some competent organisations, applying for trusted flagger status is not possible due to bureaucratic and financial constraints. These challenges, along with other concerns such as potential conflicts of interest between trusted flaggers and platforms, have been identified as possible reasons for the low number of trusted flaggers.12 

A key debate centred on whether trusted flaggers should be allowed to flag content beyond what is deemed “illegal” under national laws. For example, if a country’s hate speech laws do not include gender as a protected category, trusted flaggers may be unable to flag such content. This issue was at the heart of a recent controversy in Germany. In the press release announcing the appointment of Germany’s first trusted flagger, the Bundesnetzagentur stated that “hate and fake news” could now be removed more easily. This sparked a heated public debate, with some critics accusing the trusted flagger framework of enabling state censorship.13 The Bundesnetzagentur had to clarify its statement, reassuring the public that it was referring only to illegal hate speech and illegal fake news. 

To address these concerns, the European Commission has commissioned a study offering analysis and recommendations for the effective implementation of the trusted flagger mechanism. The study aims to provide guidance on key issues related to granting and applying trusted flagger status, among other objectives.14

Complaint Systems  

So far, DSCs have received relatively few complaints. Most of them, particularly those regarding VLOPs and VLOSEs, have been forwarded to the DSCs of establishment or to the European Commission. Complaints have also been referred to other relevant authorities, such as data protection authorities, when issues intersected with regulations like the GDPR. 

In some countries, DSCs also serve as competition or media regulators, meaning their staff must carefully differentiate between complaints related to the DSA and those that fall under other frameworks, such as the Digital Markets Act (DMA) or consumer protection laws. This requires clear internal processes to ensure that each complaint is addressed according to the correct legal framework. 

There is a misunderstanding about the effect and scope of complaints under Article 53 DSA. Some people seem to assume that DSCs have the authority to directly order the removal of illegal or harmful content from online platforms. This misconception can lead to unrealistic expectations from the media and the public. DSCs play a supervisory role, ensuring that platforms establish effective and user-friendly reporting mechanisms for illegal content and implement transparency measures. However, it is ultimately up to the platforms to act on flagged content. In fact, only judicial authorities have the power to order the removal of specific content.  

User Awareness 

A significant challenge tied to managing expectations around the role of DSCs is the general lack of public awareness regarding the mechanisms available to users. Participants shared that they have noticed many people are unsure which authority to approach with their complaints and are unfamiliar with the multiple regulatory bodies involved. There is also limited knowledge of the DSA itself and the protections it offers. This lack of awareness has likely contributed to the relatively low number of complaints received by DSCs so far. 

In response, some DSCs have initiated communication campaigns to raise citizen awareness, while others have expressed intentions to do the same. So far, CSOs have taken the lead in educating the public through publications and webinars. Additionally, the European Commission appears to be preparing a social media campaign aimed at boosting awareness at the national level. 

Collaboration between DSCs and CSOs  

Multistakeholder cooperation is essential for the effective implementation of the DSA, requiring authorities to engage with counterparts and stakeholders at the national level, including CSOs. 

Our focus groups revealed that DSCs welcome cooperation with civil society organisations. Given capacity constraints, researchers are seen as valuable sources of information that can support enforcement efforts. CSOs are keen to collect evidence of violations but cite a lack of funding. Germany offers a positive example, with plans to provide grants of up to €300,000 to support research: an important first step. CSOs can provide expertise, and their involvement in gathering evidence on systemic risks is particularly valued by DSCs.  

CSOs also seem to be increasingly involved in consultations. Some CSOs have been invited to meet with DSCs and/or VLOPs/VLOSEs, while some DSCs have established informal stakeholder networks in which CSOs participate to enhance expertise and debate on specific issues. This seems to follow in the footsteps of Germany’s decision to establish an Advisory Council, which held its first meeting in September 2024. 

Cooperation among stakeholders can take multiple forms. Some participants suggested creating non-profit out-of-court dispute resolution bodies, while others proposed that CSOs lead training and capacity-building initiatives for DSCs. CSOs can also help with independently verifying user numbers to accurately classify VLOPs. Facilitating meaningful exchanges between DSCs and CSOs, such as through roundtables or working groups, is crucial, especially during sensitive times like elections. Additionally, CSOs can advocate for structural or procedural changes within DSCs. 

Recommendations  

To the European Commission: 

  1. Support DSCs through Capacity-Building Initiatives: The European Commission should continue assisting DSCs in fulfilling their duties under the DSA. The Commission can provide capacity-building resources such as toolkits or guidelines on managing issues such as illegal hate speech and disinformation.
  2. Invest in Public Awareness Campaigns: Current low levels of public awareness about the DSA and the role of DSCs contribute to the limited number of complaints received. The Commission should invest in large-scale public awareness campaigns to inform users about their digital rights and the services available to them under the DSA. These campaigns should inform the public on how to file complaints, understand platform responsibilities, and navigate the regulatory system, thus empowering citizens to hold platforms accountable.  

To Digital Services Coordinators: 

  1. Foster Collaboration Across DSCs for Good Practice Sharing: DSCs should collaborate closely with each other to share best practices, especially around handling complaints and data access requests. Countries with stronger DSCs, like Ireland and Germany, can act as models for others; however, each DSC should adapt this model to meet its country’s needs. These collaborations will ensure a more consistent application of the DSA across the EU, aligning with the cooperative framework emphasised in the DSA. Collaboration can also aid DSCs in understanding which profiles are best to recruit.
  2. Engage with CSOs in a Consistent Manner: DSCs should deepen their engagement with CSOs, which are valuable partners in monitoring platform compliance and providing evidence of infringements. CSOs have raised concerns about funding and procedural complexity, but their collaboration with DSCs has the potential to enhance enforcement efforts, as seen in Germany’s Advisory Council initiative.
  3. Simplify the Accreditation Process for Trusted Flaggers: DSCs should streamline the procedures for accrediting trusted flaggers. In some countries these processes are overly complex and bureaucratic, as in Ireland and in countries adopting its model, leading to delays and rejections. Simplifying this process will increase the number of accredited trusted flaggers and strengthen the DSA implementation process as a whole.
  4. Run Campaigns to Raise Awareness of DSC Roles: DSCs need to run targeted awareness campaigns to inform users about their role, complaint procedures, and how to escalate issues.
  5. Advocate for Funding Schemes to Support Trusted Flaggers: Trusted flaggers are essential in moderating illegal content. DSCs should advocate for funding to support these organisations, which struggle with inadequate resources and are not offered financial incentives for their work.  

To Civil Society Organisations: 

  1. Strengthen Networking Across the EU: CSOs should enhance collaboration across EU countries by building networks to share challenges and good practices. 
  2. Form Coalitions to Advocate for Resources to Support DSA Enforcement: These coalitions can engage with donors to secure funding for projects supporting DSA enforcement at the EU and national levels.
  3. Incorporate Public Awareness in Projects: CSOs should include public education in their projects to raise awareness of digital rights and DSA mechanisms. Many users are unaware of their rights under the DSA, and CSOs can play a crucial role in bridging this knowledge gap through campaigns and publications. 

References

  1. Jaursch, J., “More than an advisory group: Why the European Board for Digital Services has key roles in DSA enforcement”, DSA Observatory, 23 February 2024, https://dsa-observatory.eu/2024/02/23/more-than-an-advisory-group-why-the-european-board-for-digital-services-has-key-roles-in-dsa-enforcement/ (17 October 2024).
  2. Coimisiún na Meán. (n.d.), “Online complaints” https://www.cnam.ie/onlinecomplaints/ (15 October 2024).
  3. Bundesnetzagentur, Complaints portal, https://www.dsc.bund.de/DSC/DE/3Verbraucher/3VB/start.html
  4. European Commission, “Out-of-court dispute settlement bodies under the Digital Services Act (DSA)”, https://digital-strategy.ec.europa.eu/en/policies/dsa-out-court-dispute-settlement#:~:text=Out-of-court%20dispute%20settlement%20bodies%20do%20not%20have%20the,but%20they%20offer%20a%20fair%20and%20swift%20review
  5. European Commission, “Commission calls on Cyprus, Czechia, Estonia, Poland, Portugal and Slovakia to designate and fully empower their Digital Services Coordinators under the Digital Services Act”, Press release, 24 April 2024, https://digital-strategy.ec.europa.eu/en/news/commission-calls-cyprus-czechia-estonia-poland-portugal-and-slovakia-designate-and-fully-empower
  6. European Commission, “Commission calls on 6 Member States to comply with EU Digital Services Act”. Press release. 25 July 2024, https://digital-strategy.ec.europa.eu/en/news/commission-calls-6-member-states-comply-eu-digital-services-act
  7. Bundesnetzagentur, “First meeting of the Advisory Council of the Digital Services Coordinator at the Bundesnetzagentur”, Press release, 18 September 2024, https://www.bundesnetzagentur.de/SharedDocs/Pressemitteilungen/EN/2024/20240918_DSC.html.  
  8. Kilroy, Deirdre, “The Irish Digital Services Act 2024 – The DSA is now fully enforceable in Ireland”, Bird & Bird, 13 February 2024, https://www.twobirds.com/en/insights/2024/ireland/the-irish-digital-services-act-2024-the-dsa-is-now-fully-enforceable-in-ireland#:~:text=The%20Digital%20Services%20Act%202024%20%28the%20%E2%80%9C%20Irish,now%20fully%20operative%20in%20Ireland%20from%2017%20February.  
  9. Jaursch, Julian, “Here is why Digital Services Coordinators should establish strong research and data units”, DSA Observatory, 10 March 2023, https://dsa-observatory.eu/2023/03/10/here-is-why-digital-services-coordinators-should-establish-strong-research-and-data-units/  
  10. European Commission, “Trusted flaggers under the Digital Services Act (DSA)”, https://digital-strategy.ec.europa.eu/en/policies/trusted-flaggers-under-dsa
  11. Bundesnetzagentur, “Bundesnetzagentur first approval for trusted flagger for online platforms in Germany”, 1 October 2024, https://www.bundesnetzagentur.de/SharedDocs/Pressemitteilungen/EN/2024/20240927_DSC_Trusted_Flagger.html?nn=691794.
  12. Goldberger, Inbal, “Europe’s Digital Services Act: Where Are All The Trusted Flaggers?”, Tech Policy Press, 13 May 2024, https://www.techpolicy.press/europes-digital-services-act-where-are-all-the-trusted-flaggers/.  
  13. Hoppenstedt, Max, “Bundesnetzagentur weist Berichte über Onlinezensur zurück”, Der Spiegel, 11 October 2024, https://www.spiegel.de/netzwelt/bundesnetzagentur-weist-berichte-ueber-online-zensur-zurueck-a-36ad6c31-f798-4e2d-9ce1-cf9910d8cc54.  
  14. European Commission, “Trusted flaggers under the Digital Services Act (DSA)”, https://digital-strategy.ec.europa.eu/en/policies/trusted-flaggers-under-dsa

Acknowledgments   

The brief is part of the “DSA Tracker: monitoring Member States’ implementation” project, funded by Civitates and conducted in partnership with the European Partnership for Democracy. The sole responsibility for the content lies with the authors; it may not necessarily reflect the positions of the Network of European Foundations or the Partner Foundations. 

