Europe's Big Tech Law Is Approved. Now Comes the Hard Part

Enforcing the sweeping Digital Services Act will be an uphill climb—and the European Commission is not yet equipped for it.
Photo-illustration: Jacqui VanLiew; Getty Images

The potential gold standard for online content governance in the EU—the Digital Services Act—is now a reality after the European Parliament voted overwhelmingly for the legislation earlier this week. The final hurdle, a mere formality, is for the Council of the European Union to sign off on the text in September.

The good news is that the landmark legislation includes some of the most extensive transparency and platform accountability obligations to date. It will give users real control over and insight into the content they engage with, and offer protections from some of the most pervasive and harmful aspects of our online spaces.

The focus now turns to implementation, as the European Commission begins in earnest to develop the enforcement mechanisms. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, in this case known as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless cooperation across borders. What's clear is that, as of now, the institutional capacity to enforce this legislation effectively simply does not exist.

In a “sneak peek,” the commission has provided a glimpse into how it proposes to overcome some of the more obvious challenges to implementation—like how it plans to supervise large online platforms and how it will attempt to avoid the problems that plague the General Data Protection Regulation (GDPR), such as out-of-sync national regulators and selective enforcement. But the proposal only raises new questions. A huge number of new staff will need to be hired, and a new European Centre for Algorithmic Transparency will need to attract world-class data scientists and experts to aid in the enforcement of the new algorithmic transparency and data accessibility obligations. The commission's preliminary vision is to organize its regulatory responsibilities by thematic areas, including a societal issues team, which will be tasked with oversight of some of the novel due diligence obligations. Insufficient resourcing here is a cause for concern and would ultimately risk turning these hard-won obligations into empty tick-box exercises.

One critical example is the platforms’ obligation to conduct assessments to address systemic risks on their services. This is a complex process that will need to take into account all the fundamental rights protected under the EU Charter. In order to do this, tech companies will have to develop human rights impact assessments (HRIAs)—an evaluation process meant to identify and mitigate potential human rights risks stemming from a service or business, in this case a platform—something civil society urged them to do throughout the negotiations. It will, however, be up to the board, made up of the DSCs and chaired by the commission, to annually assess the most prominent systemic risks identified and outline best practices for mitigation measures. As someone who has contributed to developing and assessing HRIAs, I know that this will be no easy feat, even with independent auditors and researchers feeding into the process.

If they are to make an impact, the assessments need to establish comprehensive baselines, concrete impact analyses, evaluation procedures, and stakeholder engagement strategies. The very best HRIAs embed a gender-sensitive approach and pay specific attention to systemic risks that will disproportionately impact those from historically marginalized communities. This is the most concrete method for ensuring all potential rights violations are included.

Luckily, the international human rights framework, such as the UN Guiding Principles on Business and Human Rights, offers guidance on how best to develop these assessments. Nonetheless, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more so on how well the commission and national regulators enforce these obligations. But at current capacity, the institutions' ability to develop guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA will require.

Given the enormity of these tasks, it seems that the European Commission will have to put in place dedicated professional teams of qualified human rights experts with a deep understanding of human rights impact assessments. These independent teams would need to be supported by a breadth of additional expertise and knowledge to ensure their actions are inclusive and meaningful. As it stands now, no role is foreseen for the European Fundamental Rights Agency to provide such support, and the public consultations envisaged in the development of guidelines that will shape these mitigation measures will be limited at best.

The DSA notes the necessity of civil society's input and expertise throughout the text, more than any comparable legislation before it. It is clear that the commission will need that expertise to support the development and evaluation of such assessments. Quite simply, without the meaningful engagement of advocates in the implementation and enforcement of the entire DSA, the potentially groundbreaking provisions we have collectively worked so diligently to obtain in the text won't come to fruition.

Establishing and formalizing civil society as an implementation partner, along with the European Parliament, will increase accountability and public scrutiny and ensure a human rights–centered approach to enforcement. The European Commission has already established advisory committees, or high-level expert bodies and working groups, to aid the implementation of legislation in other areas—structures that could serve as inspiration here. These entities are far from perfect and would have to be redefined for the DSA context, but the wheel would not need to be reinvented, just reimagined.

Enforcement of the DSA is going to be an uphill climb. Look no further than the ineffective and inconsistent cross-border cooperation under the GDPR. Unfortunately, there's no mechanism in the DSA to guarantee independence from political influence, and the depth of the challenges that lie ahead may not be fully understood for years. But it is not too late to rectify potential shortcomings.

As the EU institutions and national regulators build more substance into their enforcement strategies, they must acknowledge that if the DSA is to be the gold standard for online content governance, they must innovate and be bold in their approach. Their commitment to systematic engagement with civil society has been written into the law; they must realize this vision by building a collaborative approach to the enforcement mechanisms.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here. Submit an op-ed at [email protected].